Logging in Weave is done using a third-party library. Prior to Weave 2.6.7, Log4j 1.2 was used; from 2.6.7 onwards, Logback is used. It is beyond the scope of this document to explain how the logging infrastructure works in detail, so the reader is encouraged to visit the Log4j or Logback websites for more information about the inner workings of these libraries.

...

The principal components of the Logback framework that play a vital role in logging are:

  • Level: The Level object (previously referred to as the Priority object) defines the granularity and priority of any logging information. Each piece of logging information is accompanied by its appropriate Level object, which tells the Logger object the priority of the information. There are five standard levels of logging defined within the API: TRACE, DEBUG, INFO, WARN and ERROR, along with the special levels ALL and OFF. The Level values can be arranged in ascending order as:
    ALL < TRACE < DEBUG < INFO < WARN < ERROR < OFF
    where ALL means all of the logging information will be published and OFF means none of the logging information will be published.
    Additionally, when filtering log output you can specify OFF as the level for a specific logger to disable all log output from it. For example, a number of third-party libraries used by Weave also produce log output, but they produce unnecessary noise when trying to view the Weave log output, so the default Weave log configuration disables the log output from a number of these libraries by setting their specific log level to OFF (or in some cases ERROR or WARN). A configuration sketch illustrating this is shown after this list.

  • Filter: The Filter object is used to analyze logging information and make further decisions on whether that information should be logged or not. In the Logback context, Appender objects can have several Filter objects associated with them. If logging information is passed to a particular Appender object, all the Filter objects associated with that Appender need to approve the logging information before it can be published to the preferred destination attached to the Appender. Filter objects are very helpful in filtering out unwanted logging information based on any application-specific criteria.

  • Pattern: The Pattern object is specialized in providing a string representation of different objects passed to the logging framework. More precisely, when the application passes a custom Object to the logging framework, the logging framework will use the corresponding Pattern to obtain a string representation of the passed Object. This is used by Layout objects to prepare the final logging information.

  • LogManager: The LogManager object manages the logging framework. It is responsible for reading the initial configuration parameters from a system-wide configuration file or a configuration class. The LogManager stores, in a namespace hierarchy, each Logger instance created with a namespace within an application. When we try to obtain a reference to a named logger, the LogManager returns the already-created instance if one exists, or creates a new instance of the named logger, stores it in a repository for future reference, and returns the new instance to the caller application.
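
The following is a minimal configuration sketch (not a copy of the default Weave logging.xml) showing the two most common ways of restricting log output: setting an explicit level on a named logger to silence a noisy third-party library, and attaching a ThresholdFilter to an appender so that only messages at or above a given level are published by it. The logger name org.apache.http is purely an illustrative placeholder.

Code Block
languagexml
<?xml version="1.0" encoding="UTF-8"?>
<configuration>

	<appender name="console" class="ch.qos.logback.core.ConsoleAppender">
		<!-- Only messages at INFO level or above pass this filter and are published by the appender -->
		<filter class="ch.qos.logback.classic.filter.ThresholdFilter">
			<level>INFO</level>
		</filter>
		<encoder>
			<pattern>%message%n</pattern>
		</encoder>
	</appender>

	<!-- Silence a (hypothetical) noisy third-party library regardless of the root level -->
	<logger name="org.apache.http" level="OFF"/>

	<root level="DEBUG">
		<appender-ref ref="console"/>
	</root>

</configuration>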

Configuring Logging

The configuration below defines the level of the root logger as DEBUG and specifies the appender to use as testAppender. The appender testAppender is then defined as referencing the ch.qos.logback.core.ConsoleAppender class, and an encoder with a pattern is specified for it. A pattern requires that a conversion pattern, or format, be supplied. In this case the conversion pattern %message%n is supplied, which means the logging message will be printed followed by a newline character. More information about these patterns is available at https://logback.qos.ch/manual/layouts.html#conversionWord

Code Block
languagexml
<?xml version="1.0" encoding="UTF-8"?>
<configuration>

	<appender name="testAppender" class="ch.qos.logback.core.ConsoleAppender">
		<encoder>
			<pattern>%message%n</pattern>
		</encoder>
	</appender>

	<root level="DEBUG">
		<appender-ref ref="testAppender"/>
	</root>

</configuration>

A more complex configuration can attach multiple appenders to a particular logger. Each appender, in turn, can have a different layout, and that layout can have a conversion pattern associated with it.

Below is an example of a more complex configuration file. This configuration file defines the root logger as having level DEBUG and attaches two appenders, named console and file, to it.

Code Block
languagexml
<?xml version="1.0" encoding="UTF-8"?>
<configuration>

	<appender name="console" class="ch.qos.logback.core.ConsoleAppender">
		<encoder>
			<pattern>%message%n</pattern>
		</encoder>
	</appender>

	<appender name="file" class="ch.qos.logback.core.rolling.RollingFileAppender">
		<file>logs/weave.log</file>
		<encoder>
			<pattern>%date{yyyy-MM-dd HH:mm:ss.SSS} %level [%thread] %logger: "%message"%n</pattern>
		</encoder>
		<rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
			<fileNamePattern>logs/weave-%d{yyyy-MM-dd}.log</fileNamePattern>
			<maxHistory>10</maxHistory>
		</rollingPolicy>
	</appender>
	<root level="DEBUG">
		<appender-ref ref="console"/>
		<appender-ref ref="file"/>
	</root>
</configuration>

The default logging.xml file included with Weave is similar to the above example, but has some additional configuration that enables different types of log output depending upon how Weave was started. For example, if you use startup.cmd to start Weave you only get the console output, but if you start Weave as a service you only get the weave.log file output. This is done by including conditions around the appender-ref tags in the root section of the file. If you want to explicitly determine which log output is generated, you can remove those condition tags and get the same output regardless of how Weave was started.
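
As a rough illustration only (and not a copy of the default Weave logging.xml), a conditional root section using Logback's Janino-based <if>/<then>/<else> support might look something like the following. The property name weave.console is a hypothetical placeholder; the actual condition checked by the default configuration may differ.

Code Block
languagexml
	<!-- Conditional processing requires the Janino library to be available to Logback -->
	<root level="DEBUG">
		<if condition='isDefined("weave.console")'>
			<then>
				<appender-ref ref="console"/>
			</then>
			<else>
				<appender-ref ref="file"/>
			</else>
		</if>
	</root>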

Note that the default log level set in logging.xml does not affect the log information recorded in a support dump. A support dump generated from Weave will always include all log messages generated during startup of Weave and all log messages generated (up to a certain number) before the dump was produced, regardless of the log level set in logging.xml.

Reading Log Files

When Weave has been installed without any modifications to the logging.xml file, Weave logs the following information to the weave.log file within the $WEAVE_HOME/logs/ directory when started as a service/daemon. The weave.log file contains the current log messages that Weave has produced in the following format.

TIME LOGGING_LEVEL THREAD USER_ID SESSION_ID MESSAGE_ORIGIN MESSAGE
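
For reference, an encoder pattern along the lines of the one below would produce output of roughly this shape. This is a sketch only, not the pattern used by the default Weave logging.xml; in particular it assumes the user and session identifiers are placed in the logging MDC under the keys user and session, which may not match the actual key names.

Code Block
languagexml
		<encoder>
			<!-- %X{key} reads a value from the MDC; the user and session key names here are assumptions -->
			<pattern>%date{HH:mm:ss.SSS} %level [%thread] %X{user} %X{session} %logger "%message"%n</pattern>
		</encoder>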

...

Code Block
00:06:45,405 DEBUG [OSGi Console]  com.cohga.search.indexer.internal.Activator "Stopping Weave Index ..."

Item           | Value                                        | Description
TIME           | 00:06:45,405                                 | States the time the message was logged to the file. Format is HOURS:MINUTES:SECONDS,MILLISECONDS
LOGGING_LEVEL  | DEBUG                                        | The level of the message logged. Possible values are DEBUG, WARN, INFO or ERROR
THREAD         | [OSGi Console]                               | The thread that the message was logged in (see below for more information about reading logs)
MESSAGE_ORIGIN | com.cohga.search.indexer.internal.Activator  | The class that the message was logged against, e.g. the class Activator in the package com.cohga.search.indexer.internal
MESSAGE        | "Stopping Weave Index ..."                   | The message being sent to the log. Typically this is in human-readable form, however sometimes the messages may not be that intuitive to the user

The main items that change, and that should be noted, are the Thread, Message Origin and Message. These three items define when and where the message was logged from. The Thread component can cause some confusion for first-time readers if they are not familiar with multithreaded applications. A key example is that the same message may be logged three times in succession. This happens because the web server that Weave runs under handles requests from multiple different locations at the same time. Each thread is executed concurrently and works in isolation within the Weave application. An example of what you would see in the log file for this is:

Code Block
10:09:54,304 DEBUG [btpool0-0]  com.cohga.server.processor.script.ScriptProcessor "Finished script execution (in 91ms)"
10:09:54,305 DEBUG [btpool0-0]  com.cohga.server.processor.script.ScriptProcessor "Starting sending script result"
10:09:54,305 DEBUG [btpool0-0]  com.cohga.server.processor.script.ScriptProcessor "Finished sending script result (in 0ms)"
10:09:54,305 DEBUG [btpool0-0]  com.cohga.server.processor.script.ScriptRequestProcessor "Finished script request processing (in 178ms)"
10:09:54,306 DEBUG [btpool0-4]  com.cohga.server.processor.script.Activator "JSON canonical: {"map":{"extent":{"crs":"EPSG:3111","...}"
10:09:54,315 DEBUG [btpool0-7]  com.cohga.server.map.wms.WmsMapEngine "Wrote image/png image in 30ms"
10:09:54,315 DEBUG [btpool0-7]  com.cohga.server.processor.script.ScriptProcessor "Finished script execution (in 34ms)"
10:09:54,315 DEBUG [btpool0-7]  com.cohga.server.processor.script.ScriptProcessor "Starting sending script result"
10:09:54,316 DEBUG [btpool0-7]  com.cohga.server.processor.script.ScriptProcessor "Finished sending script result (in 1ms)"
10:09:54,316 DEBUG [btpool0-7]  com.cohga.server.processor.script.ScriptRequestProcessor "Finished script request processing (in 183ms)"
10:09:54,316 DEBUG [btpool0-4]  com.cohga.server.processor.script.ScriptProcessor "Staring script execution"
10:09:54,321 DEBUG [btpool0-4]  com.cohga.server.map.base.BaseMapEngine "Checking scale ranges against 20133.7824"
10:09:54,323 DEBUG [btpool0-4]  com.cohga.server.map.wms.WmsMapEngine "Wrote image/png image in 2ms"
10:09:54,324 DEBUG [btpool0-4]  com.cohga.server.processor.script.ScriptProcessor "Finished script execution (in 8ms)"
10:09:54,324 DEBUG [btpool0-4]  com.cohga.server.processor.script.ScriptProcessor "Starting sending script result"
10:09:54,324 DEBUG [btpool0-4]  com.cohga.server.processor.script.ScriptProcessor "Finished sending script result (in 0ms)"
10:09:54,324 DEBUG [btpool0-4]  com.cohga.server.processor.script.ScriptRequestProcessor "Finished script request processing (in 179ms)"
10:09:54,325 DEBUG [btpool0-9]  com.cohga.server.map.base.BaseMapEngine "Checking scale ranges against 20133.7824"
10:09:54,332 DEBUG [btpool0-9]  com.cohga.server.map.wms.WmsMapEngine "Wrote image/png image in 6ms"
10:09:54,332 DEBUG [btpool0-9]  com.cohga.server.processor.script.ScriptProcessor "Finished script execution (in 39ms)"
10:09:54,332 DEBUG [btpool0-9]  com.cohga.server.processor.script.ScriptProcessor "Starting sending script result"
10:09:54,333 DEBUG [btpool0-9]  com.cohga.server.processor.script.ScriptProcessor "Finished sending script result (in 1ms)"
10:09:54,333 DEBUG [btpool0-9]  com.cohga.server.processor.script.ScriptRequestProcessor "Finished script request processing (in 241ms)"
Info

The above example illustrates that there were four threads being executed: btpool0-0, btpool0-4, btpool0-7 and btpool0-9. It shows that btpool0-4 has added a logging message between messages logged by btpool0-0 and btpool0-7. When looking for a particular sequence of messages that have been logged for a particular user, make sure that the thread name is the same throughout the request.

Common Issues

Some of the underlying libraries used within Weave also produce logging information. Each library determines what it sees as being relevant to log at the various logging levels (i.e. ERROR, INFO, etc.). These messages can sometimes confuse the reader about where and when an error has occurred. Also, an error message may be in several parts; this is called a nested exception. Typically the exact error location and message will be the first in 

...

Note

When reading a log file you will typically work your way from the bottom to the top. If you do this and you encounter an error, go to the first location in the log file where it occurs, as it will generally be the most detailed.

Code Block

...

03:40:32,622 ERROR  [PushToDatasource-0]  com.cohga.server.stats.export.internal.ExportEvents "There was an exception during the rollback."
java.sql.SQLException: Closed Connection
	at oracle.jdbc.driver.SQLStateMapping.newSQLException(SQLStateMapping.java:70)
	at oracle.jdbc.driver.DatabaseError.newSQLException(DatabaseError.java:110)
	at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:171)
	at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:227)
	at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:439)
	at oracle.jdbc.driver.PhysicalConnection.rollback(PhysicalConnection.java:3369)
	at com.cohga.server.datasource.jdbc.internal.PooledJdbcDataSource$PooledConnection.rollback(PooledJdbcDataSource.java:468)
	at com.cohga.server.stats.export.internal.ExportEvents.doAnExport(ExportEvents.java:118)
	at com.cohga.server.stats.export.internal.ExportEvents$1.run(ExportEvents.java:46)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
	at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
	at java.util.concurrent.FutureTask.run(FutureTask.java:138)
	at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
	at java.lang.Thread.run(Thread.java:619)
03:40:32,622 WARN  [PushToDatasource-0]  com.cohga.server.stats.export.internal.ExportEvents "Failed to push events to Datasource sqlsvr1"
java.lang.RuntimeException: java.sql.SQLException: Closed Connection
	at com.cohga.server.stats.export.internal.ExportEvents.doAnExport(ExportEvents.java:125)
	at com.cohga.server.stats.export.internal.ExportEvents$1.run(ExportEvents.java:46)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
	at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
	at java.util.concurrent.FutureTask.run(FutureTask.java:138)
	at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
	at java.lang.Thread.run(Thread.java:619)
Caused by: java.sql.SQLException: Closed Connection
	at oracle.jdbc.driver.SQLStateMapping.newSQLException(SQLStateMapping.java:70)
	at oracle.jdbc.driver.DatabaseError.newSQLException(DatabaseError.java:110)
	at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:171)
	at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:227)
	at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:439)
	at oracle.jdbc.driver.PhysicalConnection.prepareStatement(PhysicalConnection.java:3046)
	at oracle.jdbc.driver.PhysicalConnection.prepareStatement(PhysicalConnection.java:2961)
	at com.cohga.server.datasource.jdbc.internal.PooledJdbcDataSource$PooledConnection.prepareStatement(PooledJdbcDataSource.java:458)
	at com.cohga.server.stats.export.internal.ExportEvents.doAnExport(ExportEvents.java:101)
	... 7 more

...