Java basics: elegant data operations with the Stream API

cscw 2021-06-23 22:25:45


Preface

When working with collection data, we usually traverse it with a for loop or an iterator, which is not very elegant. Java provides the Stream abstraction, which lets us operate on collection data element by element in a declarative style, and it also offers a parallel (multithreaded) mode.

  • Stream creation
  • Intermediate operations on streams
  • Terminal operations on streams
  • Stream aggregation with collect
  • Combining parallel streams with CompletableFuture


1 Ways to construct a Stream

Built-in static factory methods of Stream

public static<T> Stream<T> iterate(final T seed, final UnaryOperator<T> f)
public static <T> Stream<T> concat(Stream<? extends T> a, Stream<? extends T> b)
public static<T> Builder<T> builder()
public static<T> Stream<T> of(T t)
public static<T> Stream<T> empty()
public static<T> Stream<T> generate(Supplier<T> s)
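As a quick sketch of how these factory methods behave (a minimal illustration, not from the original article):

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

// iterate produces an infinite stream from a seed, so it must be limited
List<Integer> iterated = Stream.iterate(1, n -> n * 2).limit(4).collect(Collectors.toList());  // [1, 2, 4, 8]
// generate produces an infinite stream from a Supplier
List<String> generated = Stream.generate(() -> "x").limit(3).collect(Collectors.toList());     // [x, x, x]
// concat joins two streams in encounter order
List<Integer> joined = Stream.concat(Stream.of(1, 2), Stream.of(3)).collect(Collectors.toList());
// builder accumulates elements, then builds a finite stream
List<Integer> built = Stream.<Integer>builder().add(1).add(2).build().collect(Collectors.toList());
```

Both iterate and generate are unbounded unless capped with limit; JDK 9 also adds an iterate overload that takes a termination predicate.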

The stream method declared on Collection

default Stream<E> stream()
  • Collection declares the stream() conversion method, so any Collection subclass can be turned into a Stream out of the box
  • Example: converting a List to a Stream (note: on JDK 9+, count() may derive the size directly from the source and skip the peek action)

    public static void main(String[] args) {
        List<String> demo = Arrays.asList("a", "b", "c");
        long count = demo.stream().peek(System.out::println).count();
        System.out.println(count);
    }
    -------result--------
    a
    b
    c
    3

2 Intermediate operations on stream elements

Filtering: filter

Stream<T> filter(Predicate<? super T> predicate)
  • Predicate is a functional interface, so a lambda can be passed directly; for more complex filtering logic, combine predicates with the or, and, and negate methods
  • Example

    List<String> demo = Arrays.asList("a", "b", "c");
    Predicate<String> f1 = item -> item.equals("a");
    Predicate<String> f2 = item -> item.equals("b");
    demo.stream().filter(f1.or(f2)).forEach(System.out::println);
    -------result--------
    a
    b

Mapping: map

<R> Stream<R> map(Function<? super T, ? extends R> mapper)
IntStream mapToInt(ToIntFunction<? super T> mapper);
LongStream mapToLong(ToLongFunction<? super T> mapper);
DoubleStream mapToDouble(ToDoubleFunction<? super T> mapper);
  • Example

    static class User {
        Integer id;
        public User(Integer id) { this.id = id; }
        public Integer getId() { return id; }
    }
    public static void main(String[] args) {
        List<User> demo = Arrays.asList(new User(1), new User(2), new User(3));
        // map each User to its Integer id
        demo.stream().map(User::getId).forEach(System.out::println);
    }
    -------result--------
    1
    2
    3

Data processing: peek

Stream<T> peek(Consumer<? super T> action);
  • Unlike map, the action has no return value; peek passes each element through unchanged, so it is typically used for side effects such as logging or in-place mutation
  • Example

    static class User {
        Integer id;
        public User(Integer id) { this.id = id; }
        public Integer getId() { return id; }
        public void setId(Integer id) { this.id = id; }
    }
    public static void main(String[] args) {
        List<User> demo = Arrays.asList(new User(1), new User(2), new User(3));
        // square each id in place, then map each User to its Integer id
        demo.stream().peek(user -> user.setId(user.id * user.id)).map(User::getId).forEach(System.out::println);
    }
    -------result--------
    1
    4
    9

Flattening: flatMap

<R> Stream<R> flatMap(Function<? super T, ? extends Stream<? extends R>> mapper);
IntStream flatMapToInt(Function<? super T, ? extends IntStream> mapper);
LongStream flatMapToLong(Function<? super T, ? extends LongStream> mapper);
DoubleStream flatMapToDouble(Function<? super T, ? extends DoubleStream> mapper);
  • flatMap: flattens a stream whose elements are themselves Stream<T> values into a single Stream with element type T
  • Example

    public static void main(String[] args) {
        List<Stream<Integer>> demo = Arrays.asList(Stream.of(5), Stream.of(2), Stream.of(1));
        demo.stream().flatMap(Function.identity()).forEach(System.out::println);
    }
    -------result--------
    5
    2
    1

Deduplication: distinct

Stream<T> distinct();
  • Example

    List<Integer> demo = Arrays.asList(1, 1, 2);
    demo.stream().distinct().forEach(System.out::println);
    -------result--------
    1
    2

Sorting: sorted

Stream<T> sorted();
Stream<T> sorted(Comparator<? super T> comparator);
  • Example

    List<Integer> demo = Arrays.asList(5, 1, 2);
    // ascending order by default
    demo.stream().sorted().forEach(System.out::println);
    // descending order via a reversed comparator
    Comparator<Integer> comparator = Comparator.<Integer, Integer>comparing(item -> item).reversed();
    demo.stream().sorted(comparator).forEach(System.out::println);
    ------- Default ascending order result--------
    1
    2
    5
    ------- Descending result--------
    5
    2
    1

Limiting and skipping: limit and skip

// Keep only the first maxSize elements
Stream<T> limit(long maxSize);
// Skip the first n elements
Stream<T> skip(long n);
  • Example

    List<Integer> demo = Arrays.asList(1, 2, 3, 4, 5, 6);
    // skip the first two elements, then take the next two
    demo.stream().skip(2).limit(2).forEach(System.out::println);
    -------result--------
    3
    4

New operations in JDK 9

  • The difference from filter: takeWhile keeps elements while the predicate holds and stops at the first element that fails; dropWhile discards elements while the predicate holds and keeps everything from the first failing element onward

    default Stream<T> takeWhile(Predicate<? super T> predicate);
    default Stream<T> dropWhile(Predicate<? super T> predicate);
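A minimal sketch of the difference (requires JDK 9+; not from the original article):

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

List<Integer> demo = Arrays.asList(1, 2, 3, 4, 1, 2);
// takeWhile keeps elements until the predicate first fails, then stops entirely
List<Integer> taken = demo.stream().takeWhile(n -> n < 3).collect(Collectors.toList());
// dropWhile discards elements until the predicate first fails, then keeps the rest
List<Integer> dropped = demo.stream().dropWhile(n -> n < 3).collect(Collectors.toList());
System.out.println(taken);    // [1, 2] -- the trailing 1 and 2 are NOT kept, unlike filter
System.out.println(dropped);  // [3, 4, 1, 2]
```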

3 Terminal operations on a Stream

Traversal and consumption

// Consume each element
void forEach(Consumer<? super T> action);
// Ordered consumption; unlike forEach, forEachOrdered preserves the encounter order even when run on a parallelStream
void forEachOrdered(Consumer<? super T> action);
  • Example

    List<Integer> demo = Arrays.asList(1, 2, 3);
    demo.parallelStream().forEach(System.out::println);
    demo.parallelStream().forEachOrdered(System.out::println);
    -------forEach result--------
    2
    3
    1
    -------forEachOrdered result--------
    1
    2
    3

Collecting into an array

// Convert to an Object array
Object[] toArray();
// Convert to an A[] array of the specified type A
<A> A[] toArray(IntFunction<A[]> generator)
  • Example

    List<String> demo = Arrays.asList("1", "2", "3");
    //<A> A[] toArray(IntFunction<A[]> generator)
    String[] data = demo.stream().toArray(String[]::new);

Minimum and maximum: min and max

// Get the minimum element
Optional<T> min(Comparator<? super T> comparator)
// Get the maximum element
Optional<T> max(Comparator<? super T> comparator)
  • Example

    List<Integer> demo = Arrays.asList(1, 2, 3);
    Optional<Integer> min = demo.stream().min(Comparator.comparing(item->item));
    Optional<Integer> max = demo.stream().max(Comparator.comparing(item->item));
    System.out.println(min.get()+"-"+max.get());
    -------result--------
    1-3

Matching and finding

// Does any element match?
boolean anyMatch(Predicate<? super T> predicate)
// Do all elements match?
boolean allMatch(Predicate<? super T> predicate)
// Does no element match?
boolean noneMatch(Predicate<? super T> predicate)
// Find the first element
Optional<T> findFirst();
// Find any element
Optional<T> findAny();
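These signatures have no example in the original; a minimal sketch:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Optional;

List<Integer> demo = Arrays.asList(1, 2, 3);
boolean anyEven = demo.stream().anyMatch(n -> n % 2 == 0);    // true: 2 is even
boolean allEven = demo.stream().allMatch(n -> n % 2 == 0);    // false: 1 and 3 are odd
boolean noneNeg = demo.stream().noneMatch(n -> n < 0);        // true: no negatives
Optional<Integer> first = demo.stream().findFirst();          // Optional[1]
// findAny may return any element; the distinction matters mainly for parallel streams
Optional<Integer> any = demo.stream().findAny();
```

All five are short-circuiting terminal operations: they stop consuming the stream as soon as the answer is determined.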

Reduction: reduce

// Combine elements pairwise
Optional<T> reduce(BinaryOperator<T> accumulator)
// Combine elements pairwise, starting from an initial value
T reduce(T identity, BinaryOperator<T> accumulator)
// Transform each element to type U while combining, with an initial value;
// the combiner merges partial results under parallel execution
<U> U reduce(U identity, BiFunction<U, ? super T, U> accumulator, BinaryOperator<U> combiner)
  • Example

    List<Integer> demo = Arrays.asList(1, 2, 3, 4, 5, 6, 7, 8);
    // convert each number to a string and join them with "-"
    String data = demo.stream().reduce("0", (u, t) -> u + "-" + t, (s1, s2) -> s1 + "-" + s2);
    System.out.println(data);
    -------result--------
    0-1-2-3-4-5-6-7-8
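The two simpler reduce overloads can be sketched like this (my own minimal example):

```java
import java.util.Arrays;
import java.util.List;
import java.util.Optional;

List<Integer> demo = Arrays.asList(1, 2, 3);
// without an identity, an empty stream yields Optional.empty()
Optional<Integer> sum = demo.stream().reduce(Integer::sum);   // Optional[6]
// with an identity, an empty stream returns the identity itself
int total = demo.stream().reduce(10, Integer::sum);           // 10 + 1 + 2 + 3 = 16
```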

Count the number of elements

long count()
  • Example

    List<Integer> demo = Arrays.asList(1, 2, 3, 4, 5, 6);
    System.out.println(demo.stream().count());
    -------result--------
    6

Aggregating a stream: collect

/**
* supplier: produces the result container of type R
* accumulator: folds a stream element into the container
* combiner: merges two containers (parallel execution produces several partial results R that must be merged)
*/
<R> R collect(Supplier<R> supplier, BiConsumer<R, ? super T> accumulator, BiConsumer<R, R> combiner);
/**
* A Collector bundles a supplier, accumulator, combiner, finisher, and characteristics
* The Collectors utility class provides many ready-made collectors
*/
<R, A> R collect(Collector<? super T, A, R> collector);
  • Examples: see the Collectors section below
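For the three-argument collect itself, a minimal hand-rolled equivalent of Collectors.toList() (this mirrors the idiom shown in the method's Javadoc; the variable names are mine):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

List<Integer> demo = Arrays.asList(1, 2, 3);
// ArrayList::new supplies the container, ArrayList::add folds in each element,
// ArrayList::addAll merges partial containers under parallel execution
List<Integer> result = demo.stream().collect(ArrayList::new, ArrayList::add, ArrayList::addAll);
System.out.println(result);   // [1, 2, 3]
```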

4 The Collector interface and the Collectors utility class

The Collector interface and its implementation class CollectorImpl

// Produces the result container
Supplier<A> supplier();
// Folds a stream element into the container
BiConsumer<A, T> accumulator();
// Merges containers (parallel execution produces several partial containers that must be merged)
BinaryOperator<A> combiner();
// Converts the container into the final result (the last step; often an identity conversion)
Function<A, R> finisher();
// The characteristics of the collector
Set<Characteristics> characteristics();
public static<T, A, R> Collector<T, A, R> of(Supplier<A> supplier,
BiConsumer<A, T> accumulator, BinaryOperator<A> combiner,
Function<A, R> finisher, Characteristics... characteristics)
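As a sketch of Collector.of, here is a hypothetical custom collector (my own illustration) that joins integers with "|" via an intermediate StringJoiner:

```java
import java.util.StringJoiner;
import java.util.stream.Collector;
import java.util.stream.Stream;

// illustrative custom collector: Integer elements -> "1|2|3"
Collector<Integer, StringJoiner, String> joinInts = Collector.of(
        () -> new StringJoiner("|"),            // supplier: a fresh container
        (j, n) -> j.add(String.valueOf(n)),     // accumulator: fold in one element
        StringJoiner::merge,                    // combiner: merge partial containers
        StringJoiner::toString);                // finisher: container -> final result
String joined = Stream.of(1, 2, 3).collect(joinInts);
System.out.println(joined);   // 1|2|3
```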

Collecting into a List or Set

// Collect the stream into a List
public static <T> Collector<T, ?, List<T>> toList()
// Collect the stream into a Set
public static <T> Collector<T, ?, Set<T>> toSet()
  • Example

    List<Integer> demo = Arrays.asList(1, 2, 3);
    List<Integer> col = demo.stream().collect(Collectors.toList());
    Set<Integer> set = demo.stream().collect(Collectors.toSet());

Collecting into a Map

// Collect the stream into a Map
public static <T, K, U> Collector<T, ?, Map<K,U>> toMap(
Function<? super T, ? extends K> keyMapper,
Function<? super T, ? extends U> valueMapper)
/**
* mergeFunction: how to merge two values mapped to the same key
*/
public static <T, K, U> Collector<T, ?, Map<K,U>> toMap(
Function<? super T, ? extends K> keyMapper,
Function<? super T, ? extends U> valueMapper,
BinaryOperator<U> mergeFunction)
/**
* mergeFunction: how to merge two values mapped to the same key
* mapSupplier: produces the result Map
*/
public static <T, K, U, M extends Map<K, U>> Collector<T, ?, M> toMap(
Function<? super T, ? extends K> keyMapper,
Function<? super T, ? extends U> valueMapper,
BinaryOperator<U> mergeFunction,
Supplier<M> mapSupplier)
  • If two elements map to the same key, the two-argument toMap throws an IllegalStateException; supply a mergeFunction, or use groupingBy instead
  • Example

    List<User> demo = Arrays.asList(new User(1), new User(2), new User(3));
    Map<Integer,User> map = demo.stream().collect(Collectors.toMap(User::getId,item->item));
    System.out.println(map);
    -------result-------
    {1=TestS$User@7b23ec81, 2=TestS$User@6acbcfc0, 3=TestS$User@5f184fc6}
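A sketch of the mergeFunction overload (the fruit data is hypothetical), resolving key collisions instead of throwing:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

List<String> demo = Arrays.asList("apple", "avocado", "banana");
// "apple" and "avocado" share the key 'a'; without a merge function
// toMap would throw IllegalStateException; here the later value wins
Map<Character, String> map = demo.stream()
        .collect(Collectors.toMap(s -> s.charAt(0), s -> s, (a, b) -> b));
System.out.println(map);
```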

Joining strings: joining

// Concatenate multiple strings into one
public static Collector<CharSequence, ?, String> joining();
// Concatenate multiple strings into one, with the given delimiter
public static Collector<CharSequence, ?, String> joining(CharSequence delimiter)
  • Example

    List<String> demo = Arrays.asList("c", "s", "c","w"," Sneak forward ");
    String name = demo.stream().collect(Collectors.joining("-"));
    System.out.println(name);
    -------result-------
    c-s-c-w- Sneak forward 

Mapping before collecting: mapping

  • Equivalent to a map followed by a collect

    /**
    * mapper: the mapping function applied to each element
    * downstream: the collector to apply after mapping
    */
    public static <T, U, A, R> Collector<T, ?, R> mapping(Function<? super T, ? extends U> mapper,
    Collector<? super U, A, R> downstream);
  • Example

    List<String> demo = Arrays.asList("1", "2", "3");
    List<Integer> data = demo.stream().collect(Collectors.mapping(Integer::valueOf, Collectors.toList()));
    System.out.println(data);
    -------result-------
    [1, 2, 3]

Collecting and then transforming the result: collectingAndThen

/**
* downstream: the collector
* finisher: converts the collected result
*/
public static<T,A,R,RR> Collector<T,A,RR> collectingAndThen(Collector<T,A,R> downstream,
Function<R, RR> finisher);
  • Example

    List<Integer> demo = Arrays.asList(1, 2, 3, 4, 5, 6);
    // collect into a List, then return its size as the result
    Integer size = demo.stream().collect(Collectors.collectingAndThen(Collectors.toList(), List::size));
    System.out.println(size);
    ---------result----------
    6

Grouping a stream: groupingBy (the result Map is a HashMap)

/**
* classifier: maps each element T to a grouping key K
* after grouping, a List holds the elements of each group
*/
public static <T, K> Collector<T, ?, Map<K, List<T>>> groupingBy(
Function<? super T, ? extends K> classifier);
/**
* classifier: the grouping function
* downstream: the collector applied to each group
*/
public static <T, K, A, D> Collector<T, ?, Map<K, D>> groupingBy(
Function<? super T, ? extends K> classifier,
Collector<? super T, A, D> downstream)
/**
* classifier: the grouping function
* mapFactory: produces the result Map (a Map subclass)
* downstream: the collector applied to each group
*/
public static <T, K, D, A, M extends Map<K, D>> Collector<T, ?, M> groupingBy(
Function<? super T, ? extends K> classifier,
Supplier<M> mapFactory,
Collector<? super T, A, D> downstream)
  • Example

    public static void main(String[] args) throws Exception {
        List<Integer> demo = Stream.iterate(0, item -> item + 1)
                .limit(15)
                .collect(Collectors.toList());
        // split into three groups, converting each element to a String
        Map<Integer, List<String>> map = demo.stream()
                .collect(Collectors.groupingBy(item -> item % 3,
                        HashMap::new,
                        Collectors.mapping(String::valueOf, Collectors.toList())));
        System.out.println(map);
    }
    ---------result----------
    {0=[0, 3, 6, 9, 12], 1=[1, 4, 7, 10, 13], 2=[2, 5, 8, 11, 14]} 

Grouping a stream: groupingByConcurrent (the result Map is a ConcurrentHashMap)

/**
* classifier: the grouping function; after grouping, a List holds the elements of each group
*/
public static <T, K> Collector<T, ?, ConcurrentMap<K, List<T>>> groupingByConcurrent(
Function<? super T, ? extends K> classifier);
/**
* classifier: the grouping function
* downstream: the collector applied to each group
*/
public static <T, K, A, D> Collector<T, ?, ConcurrentMap<K, D>> groupingByConcurrent(
Function<? super T, ? extends K> classifier, Collector<? super T, A, D> downstream)
/**
* classifier: the grouping function
* mapFactory: produces the result Map (a ConcurrentMap subclass)
* downstream: the collector applied to each group
*/
public static <T, K, A, D, M extends ConcurrentMap<K, D>> Collector<T, ?, M> groupingByConcurrent(
Function<? super T, ? extends K> classifier,
Supplier<M> mapFactory,
Collector<? super T, A, D> downstream);
  • Usage is the same as groupingBy

Partitioning a stream in two: partitioningBy (a special case of groupingBy)

public static <T> Collector<T, ?, Map<Boolean, List<T>>> partitioningBy(
Predicate<? super T> predicate)
/**
* predicate: decides which of the two partitions each element falls into
* downstream: the collector applied to each partition
*/
public static <T, D, A> Collector<T, ?, Map<Boolean, D>> partitioningBy(
Predicate<? super T> predicate, Collector<? super T, A, D> downstream)
  • Example

    List<Integer> demo = Arrays.asList(1, 2, 3, 4, 5, 6);
    // partition into even and odd numbers
    Map<Boolean, List<Integer>> map = demo.stream()
            .collect(Collectors.partitioningBy(item -> item % 2 == 0));
    System.out.println(map);
    ---------result----------
    {false=[1, 3, 5], true=[2, 4, 6]}

Averaging: averagingInt/averagingLong/averagingDouble

// mapper extracts a double from each element; the result is always a Double
public static <T> Collector<T, ?, Double> averagingDouble(ToDoubleFunction<? super T> mapper)
// mapper extracts a long; the result is still a Double
public static <T> Collector<T, ?, Double> averagingLong(ToLongFunction<? super T> mapper)
// mapper extracts an int; the result is still a Double
public static <T> Collector<T, ?, Double> averagingInt(ToIntFunction<? super T> mapper)
  • Example

    List<Integer> demo = Arrays.asList(1, 2, 5);
    Double data = demo.stream().collect(Collectors.averagingInt(Integer::intValue));
    System.out.println(data);
    ---------result----------
    2.6666666666666665

Minimum and maximum: minBy and maxBy

// Minimum element
public static <T> Collector<T, ?, Optional<T>> minBy(Comparator<? super T> comparator)
// Maximum element
public static <T> Collector<T, ?, Optional<T>> maxBy(Comparator<? super T> comparator)
  • Example

    List<Integer> demo = Arrays.asList(1, 2, 5);
    Optional<Integer> min = demo.stream().collect(Collectors.minBy(Comparator.comparing(item -> item)));
    Optional<Integer> max = demo.stream().collect(Collectors.maxBy(Comparator.comparing(item -> item)));
    System.out.println(min.get()+"-"+max.get());
    ---------result----------
    1-5

Summary statistics: summarizingInt/summarizingLong/summarizingDouble

  • Yields the element count, sum, minimum, maximum, and average in one pass

    // mapper extracts an int; returns IntSummaryStatistics
    public static <T> Collector<T, ?, IntSummaryStatistics> summarizingInt(
    ToIntFunction<? super T> mapper)
    // mapper extracts a double; returns DoubleSummaryStatistics
    public static <T> Collector<T, ?, DoubleSummaryStatistics> summarizingDouble(
    ToDoubleFunction<? super T> mapper)
    // mapper extracts a long; returns LongSummaryStatistics
    public static <T> Collector<T, ?, LongSummaryStatistics> summarizingLong(
    ToLongFunction<? super T> mapper)
  • Example

    List<Integer> demo = Arrays.asList(1, 2, 5);
    IntSummaryStatistics data = demo.stream().collect(Collectors.summarizingInt(Integer::intValue));
    System.out.println(data);
    ---------result----------
    IntSummaryStatistics{count=3, sum=8, min=1, average=2.666667, max=5}

New collector in JDK 12: teeing

// The stream is collected by downstream1 and downstream2 independently,
// then the two results are merged with merger
public static <T, R1, R2, R> Collector<T, ?, R> teeing(
Collector<? super T, ?, R1> downstream1,
Collector<? super T, ?, R2> downstream2,
BiFunction<? super R1, ? super R2, R> merger)
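A sketch (my own example, requires JDK 12+): computing an average with teeing by collecting the sum and the count, then merging:

```java
import java.util.stream.Collectors;
import java.util.stream.Stream;

double avg = Stream.of(1, 2, 5).collect(Collectors.teeing(
        Collectors.summingInt(Integer::intValue),    // downstream1: sum = 8
        Collectors.counting(),                       // downstream2: count = 3
        (sum, count) -> (double) sum / count));      // merger: 8 / 3
System.out.println(avg);
```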

5 Using parallelStream

  • Combining parallelStream with CompletableFuture and a custom thread pool
  • Example

    public static void main(String[] args) throws Exception {
        List<Integer> demo = Stream.iterate(0, item -> item + 1)
                .limit(5)
                .collect(Collectors.toList());
        // Example 1: sequential stream, everything runs on the main thread
        Stopwatch stopwatch = Stopwatch.createStarted(Ticker.systemTicker());
        demo.stream().forEach(item -> {
            try {
                Thread.sleep(500);
                System.out.println("Example 1-" + Thread.currentThread().getName());
            } catch (Exception e) { }
        });
        System.out.println("Example 1-" + stopwatch.stop().elapsed(TimeUnit.MILLISECONDS));
        // Example 2: note the custom pool must be a ForkJoinPool; parallelStream uses the
        // pool it is invoked from, otherwise it falls back to ForkJoinPool.commonPool()
        ExecutorService executor = new ForkJoinPool(10);
        stopwatch.reset(); stopwatch.start();
        CompletableFuture.runAsync(() -> demo.parallelStream().forEach(item -> {
            try {
                Thread.sleep(1000);
                System.out.println("Example 2-" + Thread.currentThread().getName());
            } catch (Exception e) { }
        }), executor).join();
        System.out.println("Example 2-" + stopwatch.stop().elapsed(TimeUnit.MILLISECONDS));
        // Example 3: plain parallelStream, runs on ForkJoinPool.commonPool()
        stopwatch.reset(); stopwatch.start();
        demo.parallelStream().forEach(item -> {
            try {
                Thread.sleep(1000);
                System.out.println("Example 3-" + Thread.currentThread().getName());
            } catch (Exception e) { }
        });
        System.out.println("Example 3-" + stopwatch.stop().elapsed(TimeUnit.MILLISECONDS));
        executor.shutdown();
    }
    -------result--------
    Example 1-main
    Example 1-main
    Example 1-main
    Example 1-main
    Example 1-main
    Example 1-2501
    Example 2-ForkJoinPool-1-worker-19
    Example 2-ForkJoinPool-1-worker-9
    Example 2-ForkJoinPool-1-worker-5
    Example 2-ForkJoinPool-1-worker-27
    Example 2-ForkJoinPool-1-worker-23
    Example 2-1004
    Example 3-main
    Example 3-ForkJoinPool.commonPool-worker-5
    Example 3-ForkJoinPool.commonPool-worker-7
    Example 3-ForkJoinPool.commonPool-worker-9
    Example 3-ForkJoinPool.commonPool-worker-3
    Example 3-1001
  • parallelStream really does run on multiple threads, and the pool can be chosen: if the terminal operation is invoked from within a ForkJoinPool task (as with CompletableFuture.runAsync(..., executor) above), that pool's threads are used; the custom pool must therefore be a ForkJoinPool, otherwise the tasks fall back to ForkJoinPool.commonPool()
Copyright notice
This article was written by [cscw]. Please include a link to the original when reposting. Thanks.
https://javamana.com/2021/06/20210623222519713E.html
