Log Files

Log files capture information about system resources, performance, errors, and quotas.

B2C Commerce includes several different types of log files, including system logs, custom logs, import and export logs, a quota log, security logs, and batch processing logs.

To see system and custom application log files, select Administration > Site Development > Development Setup. In the WebDAV Access section, click the Log Files link. The log files home page (Index of /) lists the log files in alphabetical order. With the right permission, you can also access the log files via a WebDAV client.

There are two types of log files:

  • System logging: System information logging is turned on by default. To change your system's logging configuration, ask Salesforce Support.
  • Custom logging: If you perform programmatic logging using script files, you can configure custom logging in Business Manager. Custom logging is turned on for certain types of messages; you can configure it for the other types. Custom debug and info logs are disabled by default. Enable or disable them for a specific instance at Administration > Operations > Custom Log Settings. If custom logging is disabled, no messages are logged. Custom log file names begin with custom.
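The gating behavior described above can be sketched as a small JavaScript model. This is an illustration only, not the B2C Commerce API: the function name and the assumed defaults for fatal and error are hypothetical, while the info/debug defaults follow the text above.

```javascript
// Illustrative model of how Custom Log Settings gate custom log messages.
// Per the text: custom debug and info logs are disabled by default.
const defaultCustomLogSettings = {
  fatal: true, // assumed on by default
  error: true, // assumed on by default
  warn: true,  // customwarn is always logged to file
  info: false, // disabled by default
  debug: false // disabled by default
};

function isCustomMessageLogged(level, settings) {
  settings = settings || defaultCustomLogSettings;
  // If custom logging is disabled for this level, the message is dropped.
  return settings[level] === true;
}
```

With the assumed defaults, error messages are logged while debug messages are dropped until you enable them in Custom Log Settings.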

Production and Staging instance log files are captured daily, stored for 30 days, and then automatically deleted. The only exception is security log files, which are automatically deleted after 90 days.

Log retention applies only to Production and Staging instances; it is not supported on development and sandbox instances.

You can delete log files more frequently using WebDAV. After three days, the log files are moved into a log_archive directory and compressed into gzip files. If you want to retain log files for more than their retention time, download the files and store them locally.

Allowing files to grow indefinitely can cause performance and storage issues, so files are restricted by size. Each application server can write up to 10 MB per day into each of the customdebug, custominfo, customfatal, customerror, and customwarn log files. The same limit applies to the fatal, error, warn, debug, and info messages written to custom named log files.

To ensure optimal performance and stability, delete unnecessary log files as soon as possible. Don’t store more than 100,000 files in a single folder. If there are more than 100,000 files in a single folder, files are deleted oldest first, even if they are newer than the folder's retention period. The 100,000 limit applies to files in one specific folder, not across subfolders.

A day runs from 00:00 to 24:00 in Greenwich Mean Time (GMT). Once the 10 MB limit has been reached, logging is suspended until the next day (00:00). If the limit is almost reached (for example, the log file is 9.9 MB) and the next log message would exceed it, the message is still written, but only up to an additional 100 KB beyond the 10 MB limit. This lets you use the full permitted amount of log content. When the log file has reached its maximum size for the day, the following info message is written to the log file:
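The size rule above can be sketched as a simple budget check. This is an illustrative sketch, not the platform implementation; the function name and the exact truncation behavior for the final message are assumptions.

```javascript
// Illustrative sketch of the daily size rule for custom log files.
const DAILY_LIMIT = 10 * 1024 * 1024;   // 10 MB per application server per day
const OVERFLOW_ALLOWANCE = 100 * 1024;  // last message may run up to 100 KB past the limit

// Returns true if a message of messageBytes can still be written to a file
// that already holds currentBytes today. Once the limit is reached, logging
// is suspended until 00:00 GMT the next day.
function mayWrite(currentBytes, messageBytes) {
  if (currentBytes >= DAILY_LIMIT) return false; // daily budget used up
  return currentBytes + messageBytes <= DAILY_LIMIT + OVERFLOW_ALLOWANCE;
}
```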

System logs have no storage limit.

One log file exists per log level, per application server. Log levels for system and customization include:

  • info
  • error
  • warn
  • debug
  • fatal

The debug log level is always disabled on production instances, and enabled for in-memory logging on non-production instances. In-memory logging allows the Show Request Log feature but doesn't generate log files.

This table describes how different log files work.

| Log file | How do I activate/configure this? | How long is it retained by time or file size? | Purpose |
| --- | --- | --- | --- |
| dbinit-sql | Not configurable | 30 days | SQL used during dbinit. |
| deprecation | Not configurable | 30 days | Shows usage of deprecated APIs: date and time, API name, number of times used. If the file is empty, no deprecated APIs were used. |
| analytics | Not configurable | 30 days | Shows any activity with B2C Commerce analytics. Review these messages for active merchandising reports or any errors. |
| api | Not configurable | 30 days | Shows API problems: area (Pipeline Dictionary, Pipeline), date and time, problem type (Error, Info, Template, Warn), path, detail type (Key, Pipelet, Pipeline, Site), and problem details. Review these for script usage and violation messages. |
| customdebug | Administration > Operations > Custom Log Settings; Logger.debug in script | 10 megabytes after clicking the Log To File button (repeatable); 30 days | Review this for debug messages in custom jobs, imports, payment, or code that could impact users. |
| customerror | Administration > Operations > Custom Log Settings; Logger.error in script | 10 megabytes after clicking the Log To File button (repeatable); 30 days | Review this for errors in custom jobs, imports, payment, or code that could impact users. |
| customfatal | Administration > Operations > Custom Log Settings; Receive Email: enter a comma-separated list of valid email addresses to enable email notification of fatal messages (notifications are sent once every minute) | 10 megabytes after clicking the Log To File button (repeatable); 30 days | Review this for fatal errors in custom jobs, imports, payment, or code. |
| custominfo | Administration > Operations > Custom Log Settings; Logger.info in script | 10 megabytes after clicking the Log To File button (repeatable); 30 days | Review this for informational messages in custom jobs, imports, payment, or code. |
| customwarn | Administration > Operations > Custom Log Settings; Logger.warn in script | Always logged to file; 30 days | Review this for warn messages in custom jobs, imports, payment, or code that could impact users. |
| debug | Not configurable | 30 days | Shows debug information for the entire site if the Debug flag is enabled. Use this with SOAP testing. |
| error | Not configurable | 30 days | Shows errors in B2C Commerce scripts, templates, code, and other areas. |
| fatal | Not configurable | 30 days | Shows fatal errors in B2C Commerce scripts, templates, code, and other areas. |
| info | Not configurable | 30 days | Shows information logs reported on B2C Commerce scripts, templates, code, and other areas. |
| jobs | Not configurable | 30 days | Shows job status information: date and time, area, status (for example, Job Manager Stopped) on all Salesforce B2C Commerce instances and custom jobs. |
| migration | Not configurable | 30 days | Internal migration data. |
| performance | Not configurable | 30 days | Internal performance data. Not available in PIG instances. |
| quota | Not configurable | 30 days | Contains B2C Commerce quota details such as object quotas, object relation quotas, and API quotas. You can also select Administration > Operations > Quota Status to view status (in addition to logs). |
| sql | Not configurable | 30 days | Review these if you have replication issues. |
| staging | Replication info during replication. Not configurable. | 30 days | Information on B2C Commerce data and code replication processes (only on production, staging, and development instances). |
| sysevent | Not configurable | 30 days | Shows Appserver registration and cartridge-related logs. Shows date and time, event (for example, Active code version set to mobile). |
| syslog | Not configurable | 30 days | Shows information related to API processing, data staging, and error handling. Also shows import and export related task info, host information, and code version activation. These logs might provide useful information when troubleshooting import and export related issues. |
| console | Not configurable | 30 days | Internal console log entries. Not available in PIG instances. |
| warn | Administration > Operations > Custom Log Settings | 30 days | Shows lock status reports (good for job info), slot warnings, and warnings for servlets. Shows date and time, WARN, area (RequestHandlerServlet, Rendering, server/application name), and message details. |

The name of the console log file follows this pattern:

console-hostname-appserver_id-server_name-timestamp.log

In this pattern, timestamp is in yyyyMMdd format. Each console log entry contains: date and time, path, command (init, load, start, log), and message type (Error, Info, Severe, Warning).
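As an illustration of the console log file name pattern above, a small helper can assemble the name; the hostname, appserver_id, and server_name values here are made-up placeholders.

```javascript
// Builds a console log file name following the documented pattern:
// console-hostname-appserver_id-server_name-timestamp.log (timestamp: yyyyMMdd).
function consoleLogFileName(hostname, appserverId, serverName, date) {
  const yyyyMMdd =
    date.getUTCFullYear().toString() +
    String(date.getUTCMonth() + 1).padStart(2, "0") +
    String(date.getUTCDate()).padStart(2, "0");
  return "console-" + hostname + "-" + appserverId + "-" + serverName +
    "-" + yyyyMMdd + ".log";
}

// Example with placeholder values:
// consoleLogFileName("host01", "as01", "srv01", new Date(Date.UTC(2023, 0, 5)))
// → "console-host01-as01-srv01-20230105.log"
```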

Error and Warning messages are tracked for redundancy. If a message appears more than 10 times in 3 minutes, it is suppressed if it reappears within the next 3 minutes. When suppression begins, the following text precedes the log message text within the log entry:

Tracking of repeated messages continues while the suppression is in place. After the suppression period has ended, if a new log message appears whose occurrence rate is over the 10 message per 180-second threshold, it is logged with the message described. A new suppression period begins.
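The suppression rule above (more than 10 occurrences of a message within 180 seconds) can be sketched as a sliding-window tracker. The bookkeeping here is illustrative, not the actual platform implementation.

```javascript
// Illustrative sketch of the repeated-message suppression rule: a message
// seen more than 10 times within 180 seconds is suppressed.
const THRESHOLD = 10;        // max occurrences per window
const WINDOW_SECONDS = 180;  // 3-minute window

function makeSuppressionTracker() {
  const seen = new Map(); // message text -> timestamps (seconds) of recent occurrences
  return function shouldLog(message, nowSeconds) {
    const recent = (seen.get(message) || []).filter(
      (t) => nowSeconds - t < WINDOW_SECONDS
    );
    recent.push(nowSeconds);
    seen.set(message, recent);
    // Log normally up to the threshold; suppress beyond it.
    return recent.length <= THRESHOLD;
  };
}

const shouldLog = makeSuppressionTracker();
const results = [];
for (let i = 0; i < 11; i++) {
  results.push(shouldLog("DB connection timeout", i)); // 11 occurrences in 11 seconds
}
// results[0..9] are true; results[10] is false (suppressed)
```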

The method public static Log getLogger( String fileNamePrefix, String category ) enables you to obtain a log instance and write into a custom log file through that log instance. When you call the method with the given file name prefix the first time, B2C Commerce creates a log instance and a corresponding log file in the usual log directory. When you call the method with the same file name prefix a second time, B2C Commerce returns the same log instance as the first call, and all log messages are stored in the same log file.

The log messages are written with the same pattern as used in system and custom logs, as follows:

This is an example:

The file name prefix must follow these rules:

  • Cannot be null or an empty string
  • Must be at least three characters long
  • Can contain characters a-z A-Z 0–9 '-' '_' only
  • Can have up to 25 characters
  • Cannot start or end with '-' or '_'
  • Can only start or end with a-z A-Z 0–9
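The rules above can be checked with a single regular expression. This validator is an illustrative helper, not part of the B2C Commerce API.

```javascript
// Illustrative validator for the custom log file name prefix rules:
// 3-25 characters, only a-z A-Z 0-9 '-' '_', and a letter or digit
// at both the start and the end.
function isValidLogFilePrefix(prefix) {
  if (typeof prefix !== "string") return false;
  return /^[A-Za-z0-9][A-Za-z0-9_-]{1,23}[A-Za-z0-9]$/.test(prefix);
}
```

For example, "my-feed" is valid, while "ab" (too short), "-abc" (starts with a hyphen), and a 26-character name (too long) are not.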

The log categories provided through Business Manager for the custom script log are also applied to this type of logger.

Governance allows only 200 different log file names (therefore, 200 log instances and log files) per day. This quota is a hard limit and is counted per application server.

The quota log contains B2C Commerce quota details such as:

  • System object (data) quota status and limits
  • System API quotas status and limits
  • Sandbox system's database status
  • Name of deprecated methods and how often they were used within the last 24 hours

Security log files are located at https://<instance name>.demandware.net/on/demandware.servlet/webdav/Sites/Securitylogs.

Security log entries can look like this:

[2015-10-28 02:23:19.139 GMT] [DW-SEC] (User: 'username' (Sites), IP: 100.100.10.100 [LOGIN] : logged in.)

The security log also includes:

  • The session ID to the log entry for Business Manager logins
  • Log entries for re-logins (when the Business Manager requires the password due to inactivity). These log entries log both the old and the new session ID.

Security log files are automatically deleted after 90 days. Users and clients can’t delete security logs, or turn off security logging. If you want to retain log files longer than 90 days, you must download the files and store them locally or in a dedicated storage.

You can find import and export specific information in the import and export log files. In Business Manager, navigate to Administration > Site Development > Development Setup. In the WebDAV Access section, click the Import/Export link to view log details on any activity relative to import and export. This information can provide you with critical insight into job failures or data errors. You can use this link or access the data via a WebDAV client when you perform import and export functions via Business Manager or programmatically via a pipelet. Import and export can be performed on many types of data.

Analyze the data in these logs as part of daily operations and each time there is an issue with a related process, such as a catalog import, price book import, or custom feed.

The Import/Export directory has the following structure on all instances.

This directory contains logs for specific processes such as catalog, customer, or product batch processing, catalog import and export, inventory import, and price book import. It also contains validation logs for Price Book, Promotions, Metadata, Coupon, Inventory, and Catalog.

processlogs_archive/ — Archive of logs

This directory contains import and export source files, which include processes such as catalog import and custom jobs for feeds such as affiliates and channel advisor.

The following is an example of a subdirectory structure for src/:

Files in the /Impex folder and files in the /Impex subfolders are automatically deleted after 30 days. The subfolders are saved.

You can perform batch processing on catalogs, customers, and products. Product/catalog batch processing and customer batch processing are not combined into the same interface. View the results of either from their respective Business Manager sections. For example, you can view customer batch processes by selecting site > Merchant Tools > Customers > Batch Processing, but not by selecting site > Merchant Tools > Products and Catalogs > Batch Processing.

Batch processing logs can contain error reports as a result of a failed batch process. A log file can contain WARN, ERROR, or FATAL log messages. You can view the logs from the specific Batch Processing page, or from the Import/Export log directory, for example:

Access log files from Business Manager or a WebDAV client.

  1. To access log files from Business Manager, select Administration > Site Development > Development Setup and scroll to the WebDAV Access section, where you can click links to see the logs.

  2. To access log files from a WebDAV client:

    1. Set up an SSL connection in Total Commander, using a valid Business Manager user name and password with proper access to connect to the Logs directory.

      You can use any WebDAV client, but we recommend Total Commander. Make sure to use the Total Commander WebDAV client and not the FTP client, as sFTP is not supported.

    2. Use the following URL to access the logs:

      https://<instance name>/on/demandware.servlet/webdav/Sites/Logs

      When connected, you can browse logs, search for error strings or download to your environment.

Quota log files aggregate quota messages over a set period. Messages are written when the warn threshold or limit of a quota is exceeded.

The code location of the most recent event is written to the log file. The code locations of the aggregated events aren’t exposed. Quota log files show information about the location where a quota-warn threshold or limit was exceeded. The information includes the site name, job name, and top pipeline name. Additional information is sometimes included about the executed pipelet, script, pipeline node, and template.

Quota log files use this general format:

  • a timestamp
  • the thread name (contains type of request, site, pipeline name, and other parameters)
  • the actual message

This format produces log entries such as the following:

This section of the entry is the timestamp:

This section of the entry is the internal reference:

This section of the entry is the message:

The message contains details to help identify problems. You can see:

  • If a quota is enforced or not enforced
  • If applicable, the warn threshold
  • The limit
  • The number of times since the last log entry for this quota that the warn threshold or limit was exceeded
  • The highest observed reading (max actual)
  • The current location, which shows where the warn threshold or limit was exceeded

Read-only quotas don't have a warn threshold or limit, for example, Quota object.ProductPO.readonly@SF (enforced).