Saturday, August 29, 2020

logback-spring.xml sample



<configuration>

    <springProperty name="serviceName" source="spring.application.name" />
    <springProperty name="logRoot" source="logging.root" />

    <appender name="AUSPIX" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <file>${logRoot}/${serviceName}-auspix.log</file>
        <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
            <fileNamePattern>${logRoot}/${serviceName}-auspix-%d{yyyy-MM-dd}.log</fileNamePattern>
        </rollingPolicy>
        <encoder>
            <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [%mdc{REQUEST_UUID}] %marker %msg%n</pattern>
        </encoder>
    </appender>

    <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
        </encoder>
    </appender>

    <logger name="AUSPIX" level="info" additivity="false">
        <appender-ref ref="AUSPIX" />
    </logger>

    <root level="info">
        <appender-ref ref="STDOUT" />
        <appender-ref ref="AUSPIX" />
    </root>

</configuration>
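A minimal sketch of how the config above might be driven from application code. The logger name "AUSPIX" and the REQUEST_UUID MDC key come from the config; the AUDIT marker and the log message are illustrative assumptions:

```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.slf4j.MDC;
import org.slf4j.Marker;
import org.slf4j.MarkerFactory;

import java.util.UUID;

public class AuspixLogDemo {

    // Matches the named logger in logback-spring.xml; events go only to the
    // AUSPIX file appender because additivity="false".
    private static final Logger auspixLogger = LoggerFactory.getLogger("AUSPIX");
    private static final Marker AUDIT = MarkerFactory.getMarker("AUDIT");

    public static void main(String[] args) {
        // %mdc{REQUEST_UUID} in the encoder pattern reads this MDC entry.
        MDC.put("REQUEST_UUID", UUID.randomUUID().toString());
        try {
            // %marker in the pattern renders "AUDIT" here.
            auspixLogger.info(AUDIT, "order accepted");
        } finally {
            MDC.remove("REQUEST_UUID");
        }
    }
}
```

Clearing the MDC entry in a finally block matters when threads are pooled, otherwise a stale REQUEST_UUID can leak into logs for the next request handled by the same thread.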

Wednesday, August 19, 2020

Linux commands

1. Check OS version in Linux

cat /etc/os-release

2. Display the amount of available disk space for file systems

df -h 

3. Search inside .gz files

zgrep -H 'key words' *.gz


Sunday, August 16, 2020

Log4j log to MongoDB

Log4j 2 writes logs to MongoDB

1. org.springframework.data.document.mongodb.log4j.MongoLog4jAppender has been deprecated and removed.

2. In the log4j2.xml configuration, use MongoDb instead of MongoDb2 or MongoDb3, or you may get "ERROR NoSql contains an invalid element or attribute "MongoDb2"" or "ERROR NoSql contains an invalid element or attribute "MongoDb3"":

    <NoSql name="mongoAppender">
        <MongoDb databaseName="auspix" collectionName="log"
            server="localhost" port="27017" username="" password="" />
    </NoSql>
    <Async name="mongoAppenderAsync">
        <AppenderRef ref="mongoAppender" />
    </Async>

3. In the pom.xml configuration, you must provide the dependencies below, and the log4j-core version must be 2.9.1, the same as log4j-nosql, or you will get "ERROR appender NoSql has no parameter that matches element MongoDb":

    <dependency>
        <groupId>org.mongodb</groupId>
        <artifactId>mongo-java-driver</artifactId>
    </dependency>
    <dependency>
        <groupId>org.apache.logging.log4j</groupId>
        <artifactId>log4j-core</artifactId>
        <version>2.9.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.logging.log4j</groupId>
        <artifactId>log4j-nosql</artifactId>
        <version>2.9.1</version>
    </dependency>


The project is on my GitHub: https://github.com/auspix/log4j2mongo

Reference:

https://logging.apache.org/log4j/log4j-2.9.1/log4j-nosql/index.html

https://logging.apache.org/log4j/log4j-2.9.1/manual/appenders.html#NoSQLAppender

https://www.codeleading.com/article/53808223/

https://yezhwi.github.io/2018/10/26/%E5%88%A9%E7%94%A8Log4j2%E5%BC%82%E6%AD%A5%E4%BF%9D%E5%AD%98%E6%97%A5%E5%BF%97%E5%88%B0MongoDB%E4%B8%AD/

Friday, August 14, 2020

A note on Java 8 parallel streams vs ExecutorService

Java 8 parallel streams look neat, compact, and easy; however, they have a limitation: you cannot specify the thread count or the thread pool.

I did a comparison of the two implementations: I have a list of 3,470 stock codes, and for each code I need to access a URL to retrieve its name.

Serial retrieval takes 900+ seconds.

Parallel streams take 138 seconds.

ExecutorService takes 45 seconds with 20 threads, 18 seconds with 50 threads, and 5.75 seconds with 200 threads.

// from 900+ seconds down to 138 seconds with parallelStream
List<Code> codeList = codeStringList.parallelStream().map((code) -> {
    String name = null;
    try {
        name = SinaDataProvider.getName(code);
    } catch (IOException e) {
        logger.error("Fatal error to get name for " + code + "," + e.getMessage());
    }
    logger.debug(code + "," + name);
    Code codeObj = new Code();
    codeObj.setCode(code);
    // remove blank space from name; guard against a null name when the lookup failed
    codeObj.setName(name == null ? "" : name.replaceAll("\\s", ""));
    codeObj.setStart_time(yyyyMMdd);
    codeObj.setEnd_time("");
    return codeObj;
}).collect(Collectors.toList());

// ExecutorService: 45 seconds with 20 threads,
// 18 seconds with 50 threads, 5.75 seconds with 200 threads
ExecutorService executor = Executors.newFixedThreadPool(200);
Collection<Future<Code>> futures = new LinkedList<>();
for (String code : codeStringList) {
    futures.add(executor.submit(() -> {
        String name = null;
        try {
            name = SinaDataProvider.getName(code);
        } catch (IOException e) {
            logger.error("Fatal error to get name for " + code + "," + e.getMessage());
        }
        logger.debug(code + "," + name);
        Code codeObj = new Code();
        codeObj.setCode(code);
        // remove blank space from name; guard against a null name when the lookup failed
        codeObj.setName(name == null ? "" : name.replaceAll("\\s", ""));
        codeObj.setStart_time(yyyyMMdd);
        codeObj.setEnd_time("");
        return codeObj;
    }));
}
List<Code> codeList = new ArrayList<>();
for (Future<Code> future : futures) {
    try {
        codeList.add(future.get());
    } catch (Exception e) {
        throw new RuntimeException(e);
    }
}
executor.shutdown();
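A common workaround for the thread-count limitation (a sketch, not from the original post): submit the parallel-stream pipeline to a dedicated ForkJoinPool, whose parallelism the stream then uses instead of the shared common pool. The parallelism of 20 and the doubling operation below are arbitrary examples:

```java
import java.util.List;
import java.util.concurrent.ForkJoinPool;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

public class CustomPoolParallelStream {
    public static void main(String[] args) throws Exception {
        List<Integer> input = IntStream.rangeClosed(1, 100)
                .boxed()
                .collect(Collectors.toList());

        // A dedicated pool with parallelism 20; the stream runs here instead
        // of in ForkJoinPool.commonPool().
        ForkJoinPool pool = new ForkJoinPool(20);
        try {
            List<Integer> doubled = pool.submit(() ->
                    input.parallelStream()
                         .map(n -> n * 2)
                         .collect(Collectors.toList())
            ).get();
            System.out.println(doubled.get(0) + "," + doubled.get(99));
        } finally {
            pool.shutdown();
        }
    }
}
```

Note this relies on an implementation detail of the stream framework (tasks forked from inside a ForkJoinPool stay in that pool) rather than a documented API, so a fixed-size ExecutorService remains the more explicit choice.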

Thursday, August 13, 2020

Nice email

The CEO sent a really nice email:

In many ways, the pandemic has changed how we interact with one another – a plastic divider separates us from our colleagues, a mask covers our friendly smile as we greet customers, and many in-person meetings have moved online. And yet, despite this physical distance, our XX family is closer than ever. We are standing together by standing apart – and standing tall – as we demonstrate compassion and understanding in the face of hardship, each in our own way.

Monday, August 10, 2020

MongoDB Auspix note

db.stock.getIndexes();

db.stock.createIndex({"code":1}); 

db.stock.createIndex({"time":1});

db.stock.find( { $and: [ {"code":"SH1A0001"},{"time":{$lte:"20191231 15:00"}}]} ).sort({"time":-1}).limit(1).pretty();

db.stock.distinct("code").length

db.code.find({start_time:{$ne:20200810}}).pretty()


db.stock.find({code:"SH000001"}).sort({time:1}).limit(3).pretty();


db.code.find({end_time:{$ne:""}}).pretty()

db.code.find({$and:[{code:"SH603277"},{end_time:{$ne:""}}]}).pretty()

-- sort and limit: need to use $match

db.stock.aggregate([
    { $match: { code: "SH000001" } },
    { "$sort": { "time": -1 } },
    { "$limit": 5 },
    { "$sort": { "time": 1 } }
]).pretty()


-- sort and limit with a time filter

db.stock.aggregate([
    { $match: { $and: [ { code: "SH000001" }, { time: { $lte: "20200810 15:00" } } ] } },
    { "$sort": { "time": -1 } },
    { "$limit": 5 },
    { "$sort": { "time": 1 } }
]).pretty()

The Java implementation:

public List<Stock> getStockList(String code, String timeStr) throws JsonProcessingException {
    List<Stock> stockList = new ArrayList<>();
    MongoCollection<Document> stockCollection = mongoTemplate.getCollection(stockColName);
    // match/sort/limit are static imports from com.mongodb.client.model.Aggregates
    AggregateIterable<Document> stockAgg = stockCollection.aggregate(Arrays.asList(
            match(Filters.and(Filters.eq("code", code), Filters.lte("time", timeStr))),
            sort(Sorts.descending("time")),
            limit(379),
            sort(Sorts.ascending("time"))));
    ObjectMapper objectMapper = new ObjectMapper();
    for (Document doc : stockAgg) {
        doc.remove("_id");
        Stock stock = objectMapper.readValue(doc.toJson(), Stock.class);
        stockList.add(stock);
        logger.debug("stock===" + stock);
    }
    return stockList;
}


Getting distinct values over multiple fields is not straightforward and not efficient.

Neither of the queries below returns what I expect:


db.stock.aggregate([
    { "$group": { "_id": { code: "$code", name: "$name" } } }
])


db.stock.aggregate([
    { $group: {
        _id: null,
        code: { $addToSet: '$code' },
        name: { $addToSet: '$name' }
    } }
])
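One way to get flat { code, name } documents for each distinct pair, sketched against the same stock collection and assuming MongoDB 3.4+ for $replaceRoot: group on the compound key as in the first query, then promote the compound _id back to the document root:

```
db.stock.aggregate([
    { $group: { _id: { code: "$code", name: "$name" } } },
    { $replaceRoot: { newRoot: "$_id" } }
])
```

This still scans every document to build the groups, so for frequent lookups a separate code collection (as used elsewhere in these notes) is the cheaper option.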