SonarC Application

To open the SonarC GUI, point your browser to one of the following URLs (assuming you used the default ports during installation):

https://<your SonarC hostname or ip>:8443/sonarc.xhtml
or
https://<your SonarC hostname or ip>:8443
or
http://<your SonarC hostname or ip>:8080

The SonarC GUI has three sections:

  • A discovery GUI allowing you to search through the data using any set of conditions.
  • JSON Studio - a GUI that allows you to run arbitrarily sophisticated queries and analytics as well as build powerful visualizations using the Guardium data.
  • A set of predefined reports that you can use to view security data. You can also distribute these forms to end-users who can run reports themselves in a self-service approach.

Users

SonarC users are created and managed by the SonarC administrator using the SonarC User/Role Management Console. Using this console you can view users and roles, add roles to existing users, manage passwords, and create new users. All such console activity is audited, and passwords must conform to the password complexity configuration.

Password complexity configuration is kept in web.xml using the following parameters (an example follows the list):

  • sonarg.pwdExpiration - interval in months requiring a password change (default is 2)
  • sonarg.pwdComplexity - required composition of password (default requires an upper, a lower, a digit and a special character)
  • sonarg.pwdLength - minimal length (default is 8)
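
These parameters are set as context-param entries in web.xml, in the same format shown for sonarg.reportType later in this chapter. A minimal sketch using the documented defaults (the exact value encodings, particularly for sonarg.pwdComplexity, are assumptions):

    <context-param>
          <param-name>sonarg.pwdExpiration</param-name>
          <param-value>2</param-value>
    </context-param>
    <context-param>
          <param-name>sonarg.pwdLength</param-name>
          <param-value>8</param-value>
    </context-param>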

Users and passwords may also be created using the administrator shell (CLI). After logging in with the admin account, create the user using:

$ mongo <sonarc host>:27117/admin -uadmin -p
> db.createUser({
    user: "janedoe@jsonar.com",
    pwd: "s6p3rs3cr3t",
    roles: [ "sonarGUser" ]
  })

Users can change their passwords from the Studio (GUI) using the “Connected as” drop-down or using the CLI/command shell:

$ mongo <sonarc host>:27117/admin -u'janedoe@jsonar.com' -p
> db.changeUserPassword("janedoe@jsonar.com", "jsonar");

Note that it is not mandatory to use an email address as the username, but it is recommended. Functionality such as notifications and report sign-off relies on the username being an email address; if your username is not an email address you will not be able to review and sign off on your reports.

Prebuilt Reports and Dashboards

SonarC includes a set of prebuilt reports and dashboards that you can use as-is or as starting points for customized reports. The prebuilt reports can be viewed from the Predefined drop-down menu or from the Admin Tasks drop-down menu. Almost all reports can be viewed as an HTML report or downloaded as a CSV. Most reports require a from-to date range; many predefined data reports can also be run for the past 24 hours to avoid having to choose the date range. The current set includes:

Review Reports

  • User Involvement Justifications: Report showing current Justify workflow events involving the logged-in user.
  • Pending Justifications: Report showing currently open Justify workflow events assigned to the user or one of the roles of the user.
  • User Review / Sign-Off: (pre v3) To-do list requiring sign-off when not using the Justify applications (allows signing and commenting at a report level). As of v3.1 these can and should be signed off within the Justify application.

User Reports

  • Users: A SonarC administrator can view the list of defined users and their roles.
  • Admin Review/Sign: A SonarC administrator can view all reports / sign-off requests for all users and sign on their behalf.

Incoming Files Reports

  • File Ingest: Shows a list of ingested files by collector / type / time.
  • Processed Ingest: Shows a pivot table of ingested files for all collectors for the last 7 days. Each type shows how many files exist and for how many different hours data files exist.
  • Files Pivot: Shows a pivot table of files on disk for 7 days back from the selected time (regardless of ingestion). Directories include incoming, archive and audit or a .* for all. Complete is either 1 for “complete” files, 0 for data files or 2 for both.
  • Daily Incoming Data Over: Shows collectors that have sent in aggregate over X MBs of (uncompressed) data in a single day (looks only at the last 7 days). You can use this to set an email alert that notifies you when a collector suddenly sends over 3GB (to catch a policy bug quickly before collectors’ disks fill up).
  • File Sets: Shows potential set gaps (among session, session end, exception and full_sql).
  • Expected Files: Shows a pivot with the last 7 days where each day shows which file sets may be incomplete (given a starting set).
  • Expected Files (Detailed): Similar to the above but with a detailed hourly breakdown.

Job Reports

  • Scheduled: Shows the defined schedules; the date filter is when the job was first scheduled.
  • Dispatcher: Shows which jobs were fired and what their current status is.
  • Processed: Shows more details on jobs fired and processed.

Workflow Reports

  • All Justifications: Shows all Justify events and transitions based on a variety of filters.
  • Pending Justifications: Shows open (pending) justifications, who they are assigned to, and what status they’re in.
  • Definitions: Report showing the workflow transition definitions.
  • Admin Review / Sign: (pre v3) To-do list allowing an admin to sign off on behalf of a user when not using the Justify applications (allows signing and commenting at a report level). As of v3.1 these can and should be signed off within the Justify application.

Run Times

  • Report: Shows run times of all running reports and queries.
  • Graph: Shows the same information in graph format.

Reduction for Reports

Reports often include very detailed information. While many reports provide aggregated information too, SonarC allows you to take any report (prebuilt or your own) and reduce it by any number of dimensions so that you can view the data the way you want. Click on the Reduce button, select which fields to aggregate by, which fields to omit, which fields should contribute a representative value (one of the values), and which fields to combine into a list, then click Reduce Now. You will get a report with the reduced data set. You can perform multiple reductions by further reducing the reduced data set using a field subset. You can save, load and share reducer definitions.

Using JSON Studio and Using the Report Builder

The most robust tool in the system is JSON Studio (click the Analyze button). Using JSON Studio you can build any pipeline, analyze any data and create any report, view or dashboard. For full documentation of the tool see http://jsonstudio.jsonar.com. However, with great power comes a learning curve, and JSON Studio is better suited to advanced admins and developers.

For adding simple reports to the Predefined Reports section on SonarC’s home page use the Report Builder (via the Add Report button). Using this tool, select a collection (data set). The metadata of the collection will be displayed, showing you which fields exist, what the data types are, and what the frequencies are (if the screen comes up blank, click the Rebuild Metadata button; this may take a few seconds but will rebuild the metadata for all collections).

You can then begin to build your report. Select which fields the report should display by entering sequential numbers (starting with 1) in the Field Order in Report column. A zero, like an empty selection, means that the field will not appear on the report. You can also specify the field heading used in the report in the Field Header in Report column. Leaving the latter empty will use the field name for the heading.

Each report has filtering criteria. To add a field for searching on, enter a sequence number in the Order in Submit Form column. Only strings, numbers and dates can be searched on. When you add a date or numeric field the form will have a from-to pair of variables. When you add a string field you can determine whether the search is based on a regular expression, a perfect match or word containment; the default is regex.

Each user can only see/edit their own report definitions. When you are done building the report, save it and add it to the custom menu for yourself or for all users, then refresh the home page for it to take effect. You can also use the Preview button to test your new report.

Scheduling and Dispatching

Queries defined within the SonarC UI can be scheduled for delivery using the scheduler. Similarly, predefined reports can be scheduled directly on the predefined report filter screen and JSON Studio queries can be scheduled.

A scheduled job is defined by selecting what the job contents will be (i.e. which query/report), what schedule to use, and how to deliver the results. The schedule is defined as a CRON string, and delivery can be via email, by SCP-ing the results to a host/directory, or as events sent over syslog. SMTP, syslog and SCP targets need to be configured in the dispatcher.conf file for delivery to occur.

Configuration differs slightly when scheduling a query from the SonarC GUI or JSON Studio vs. scheduling a predefined report. From the SonarC GUI or JSON Studio, once you have a saved query that you want to schedule, click on the Publish URL button. You do not need to enter your credentials, but you do enter any bind variables needed unless you plan to supply a bind variable collection (more on that in the next section). Click on the Compute URL button and validate your URL, then click on the Schedule Job button and select the initial day you want the query to run, or pick an existing scheduled job if you already have one that you wish to edit. Enter all the scheduling parameters, including the CRON string, and save.

Note that the time slot displayed on the calendar is computed from the CRON string and is displayed as the next fire time regardless of which day you initially clicked on. The CRON schedule is displayed as UTC time adjusted to the timezone of your browser, but the schedule fires at the time of the SonarC server. For example, if you select a CRON string of 0 30 23, the scheduler will fire at 11:30pm on the SonarC machine, and the slot used in your calendar will be 11:30pm UTC adjusted to your browser’s timezone. When you save the job definition the job is persisted and the schedule created.
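
For reference, here are a few CRON strings in this seconds-minutes-hours format (the first matches the example above; the others are assumptions based on standard Quartz-style cron semantics):

    0 30 23     fires daily at 11:30pm server time
    0 0 9       fires daily at 9:00am server time
    0 0 */6     fires every 6 hours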

To schedule a predefined report navigate to the Predefined Views page and click the button for the appropriate report you want to schedule. This opens the filter page for the report. Enter your credentials and click on the scheduler button. Select the scheduled time and job attributes and then click on the “Schedule” button in the form. This will validate the job and schedule it.

Very often you want to schedule a report that has a “from” and a “to” variable (e.g. to determine which activities should be included). Because you might want this report to run every day or every week or every month you should not hard-code the from/to times.

Instead, use the following syntax to ensure that the from/to time frames are generated when the job schedule fires. Build the query using a bind variable (e.g. called $$from and/or $$to) and then, when defining the schedule on the API screen, bind one of the following strings to from and/or to:

  • “$$LMRM_NOW” - will be the time when the scheduler fires.
  • “$$LMRM_NOW-4DAYS” - will be 4 days prior to when the scheduler fires.
  • “$$LMRM_NOW-7HOURS” - will be 7 hours prior to when the scheduler fires.

You can use any number and you can use a + or a - sign. If you want 1 week use 7DAYS; if you want a month use 31DAYS to ensure that all activity is included even in longer months. Do not include spaces, and remember that the bind variable must be within quotes.

As an example, if you want to schedule a report that sends all sessions received in the past 6 hours use a predefined report for session and enter “$$LMRM_NOW-6HOURS” (with the double quotes) in the from field and “$$LMRM_NOW” (with the double quotes) in the to field.

As another example, your custom query might have a variable called “$$from”. In the Bind Variables section you can map from to “$$LMRM_NOW-1DAYS”. When you save (and publish!) your query, click the Publish URL button. Fill in the bind var mapping as shown below, then click on Publish URL. Then click on Schedule Job and enter the scheduling information.

_images/bind.jpg

Common pitfalls (examples follow the list):

  1. Using DAY instead of DAYS.
  2. Using __ instead of _.
  3. Forgetting the $$.
  4. Defining the LMRM_NOW in the query rather than as the bind variable mapping.
  5. Forgetting the quotes.
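
For example, with a correct value first and each pitfall applied to it:

    "$$LMRM_NOW-7DAYS"     correct
    "$$LMRM_NOW-7DAY"      wrong: DAY instead of DAYS
    "$$LMRM__NOW-7DAYS"    wrong: __ instead of _
    "LMRM_NOW-7DAYS"       wrong: missing the $$
    $$LMRM_NOW-7DAYS       wrong: missing the quotes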

When delivering results by email you can choose between three options:

  • Notification - you will receive an email that results are ready and will need to login to SonarC using the Review/Sign button on the predefined reports page.
  • Link - you will receive an email with a link. Click on the link to see the results. If you also want to sign the results login to SonarC using the Review/Sign button on the predefined reports page.
  • Attachment - you will receive an email with an attachment (the PDF or CSV file). If you also want to sign the results, login to SonarC using the Review/Sign button on the predefined reports page. Note that the dispatcher has a configurable limit for attachment size; attachments larger than this size will be omitted from the email.

Scheduled jobs are fired by the Tomcat server hosting the SonarC applications. Note that if the Tomcat server is down when a schedule passes, the job will not be fired when the server restarts. Scheduled jobs are only fired when the scheduling Tomcat server is up.

When a schedule fires the scheduler adds a directive for the dispatcher. The dispatcher runs the report and creates the CSV or the PDF. These are then copied or sent based on the job definition. A deliverable may be sent to multiple users - each receiving a separate email. Note that you can deliver results by email even to people who do not have a user defined in the system - they just cannot sign receipt. For users to be able to receive and sign a result they need to be defined as users in the system and their username needs to be their email address.

There are two ways to deliver results to different users:

1. You can upload a spreadsheet to define multiple runs, each with a different set of values assigned to a different email address. You then specify the collection you are using in the “Parameter collection for bind variables” field. For each line in this collection the dispatcher runs the report (each time with a different set of bind values) and emails the result to the specified user.

2. If you structure your pipeline/query/report in a way that one of the fields has an email address you can enter this field into “Field used for email target”. The dispatcher will auto-split the results into multiple reports and send each person only the lines that “have their name on them”.

Users can sign receipt of the deliverables from the Review/Sign button on the predefined reports page. Admins can sign deliverables on behalf of other users (for example when those users are not available). The user who did the signing is recorded and can be viewed on the same report where the signing is done. Each deliverable can only be signed once.

Jobs create either PDFs, CSVs or both; when selecting BOTH, both are delivered by email.

All CSVs and PDFs generated by the dispatcher as part of these scheduled jobs can be archived by the dispatcher. When configured, deliverables created per day are packaged into a single compressed and encrypted archive file and copied from the SonarC host to an SCP target. Each of these archive files also includes the sign-offs that occurred during that day as a CSV report. PDFs and CSVs are purged from the SonarC host based on a configured retention period (e.g. all CSVs and PDFs older than 30 days are purged). All these configurations are done by the SonarC admin by modifying dispatcher.conf in the sonarFinder install directory.

When sending email notifications the dispatcher has two modes. In the default mode each scheduled job produces a CSV or PDF and causes a separate email to be sent to the users. If you have multiple jobs that fire at the same time producing different deliverables (e.g. different tasks in a classic audit process) all deliverables can be combined into one email with a merged PDF or a single zip of CSVs. To enable aggregation of results add the following to the [dispatch] section of dispatcher.conf and restart the dispatcher service:

bulk_email_report = false

Limits:

By default CSVs and PDFs will not contain more than 100M rows. You can bypass this limit using the following command, run in the lmrm__sonarg database:

db.lmrm__preferences_override.update(
   {app_name: "JsonStudio", username: "*"},
   {$set: {apiJSONReturnLimit: NumberInt(<new limit value>)}})

Note that generating very large reports takes time. The query work is very fast but the generation of the report itself can be slow (reading the query results, converting to a CSV and then converting to a PDF if required). CSV generation works at a rate of approximately 100K lines per minute and can vary with hardware. Here are example times for an extreme (and unrealistic) report generating a CSV and then converting it to a PDF for a query yielding 2.7 million rows on an AWS m4.4xlarge machine (the PDF has over 130,000 pages). It takes 120 seconds for the warehouse to compute the result set. Generating the CSV takes an additional 20 minutes, with a CSV file size of 355MB. Generating the PDF takes an additional 3 hours, with a PDF file size of 417MB. While lengthy, generating the CSVs and PDFs does not place heavy processing load on the servers. PDF generation itself can take a very large amount of memory, so prefer not generating PDFs for hundreds of thousands of lines, or ensure that you have enough virtual memory. CSVs are more manageable and more efficient for very large data sets.

Using Bind Variables in Scheduled Jobs

Queries and reports generated by the scheduler/dispatcher support all bind variable features of SonarC and of JSON Studio. However, when the dispatcher generates results it must have access to all the values for all bind variables or results cannot be generated.

There are three types of bindings that can occur:

  • When you define the schedule you can specify the bind values when you generate the URL.

  • If you use a built-in time-based bind value the dispatcher will fill the value in. This is useful, for example, when you run a report on a periodic basis (e.g. once a day, once a week, once a month) and each run should look at the previous timeframe. Use a built-in bind variable via one of the following forms (as described in the previous section):

    $$LMRM_NOW
    or
    $$LMRM_NOW-<some num>DAYS|HOURS (or +)
    
  • Use custom binding - this is the most flexible method and allows you to generate arbitrary bindings and multiple deliveries to receivers based on custom data and custom mappings.

In order to use custom binding you must generate binding data and save it in a collection in the lmrm__scheduler database. The publish screen allows you to generate a template for this in Excel - you can then fill in the values and use the JSON Studio Spreadsheets application to insert it into lmrm__scheduler (or use any application/driver that you are familiar with to insert the data into the database).

Bind value collections are simple to create and simple to understand. Each document (row) specifies a set of values used to bind the variables and a set of receivers. When the dispatcher runs, it reads in this bind collection and generates one report/CSV for each document/row in the collection. It then delivers each report to its receivers.

As an example, suppose that you have a report parameterized by the DB User and you want each DBA to receive a report on their activity. You would create a bind collection that has a document/row per DBA where each document would just have two fields - the DB User name and the email of the DBA. When the dispatcher runs it will generate a report per DB User by using the value per document in the query thus generating activity filtered for that user. It will then email that PDF or CSV to the email specified in the second field of each such document.
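
As a sketch, such a document could be inserted from the mongo shell as shown below (the collection name, the bind field name and the addresses are illustrative; note that the email and copies fields must be arrays and that string bind values are double-quoted, as described in the Important note below):

    $ mongo <sonarc host>:27117/lmrm__scheduler -uadmin -p
    > db.dba_activity_binds.insert({
        name: "dba_activity",            // hypothetical identifier for this job's rows
        "DB User": "\"SCOTT\"",          // string value bound to the report's DB User variable
        email: [ "scott@example.com" ],  // the receiving DBA; must be an array
        copies: [ ]                      // no one copied
      })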

You can also parameterize the subject, header and footer used in the email notification. Use the same syntax as you use with queries. For example, if your query (and thus your binding collection) has values for $$commandGroup, then you can have the header, footer and subject defined as shown below:

_images/bind3.jpg

Note that you cannot embed a $$LMRM_NOW in a custom binding - these must be part of the published URL definition. The example below shows an API/report definition that has four bind variables - from, to, client and commandGroup. In the example, from and to are defined as part of the published URL, and the spreadsheet used to populate the bind var collection specifies the client and the commandGroup. It would not work to omit the $$LMRM_NOW definitions and instead add two more columns to the bind var collection / spreadsheet. Also note that the limit (or specifying -1 to remove the limit) must be set in the publish section.

_images/bind1.jpg _images/bind2.jpg

Important: The email and copies fields must be array types. If you are using the auto-generated spreadsheet, for example, and do not need any copies, make sure to enter []. If you want to email a single individual with an email address of jane@jsonar.com, enter [“jane@jsonar.com”]. The bind variable replacement values need to be in the appropriate format, exactly as you would use bind variables in JSON Studio and Sonar Gateway. As an example, if the replacement value is a group name then, since a group name is a string, the value must be enclosed in double quotes - e.g. “Servers belonging to Jane”. Do not include single quotes within such string values.

Starting in Version 4.0 the bind variables can be entered into a table directly on the scheduled job screen on the bind variables tab.

Also starting with Version 4.0, if you would like the bind variable values to be written into the report as part of the header (for audit purposes), use the $$LMRM__BVS placeholder. This will be replaced with a line or lines that record which bind variables were used for the report run - for example, the header field can be: “some header $$LMRM__BVS and more”, which will show in the report as:

some header
Bind values: name=testbvheader; user=admin; copy_list=[ ]; file="avi"; names="Mark";
email_list=[ "ron@jsonar.com"];
and more

Using Bind Variables for Defaults

When you build a query that is published as a custom report and want the user to select a date, put a value of $$LMRM_NOW-1DAYS or $$LMRM_NOW+1DAYS in the default field. Do not use double quotes. Note that this value must have +/-1 and not another number, as it only serves to create the date picker widget. This is NOT the value that will be used for the query - the user will need to enter a date value before submitting the query.

Using $$LMRM_NOW as Values

When you have an input value (as a user) that is a date on a form or a bind variable of a dashboard, you can use the $$LMRM_NOW+/-#DAYS syntax to pick the date that you desire (no double quotes).

$$LMRM_NOW Summary

There are many ways to use relative dates - both as a user and as a designer; here is a summary (consolidated examples follow the list):

  • As a user you may use the $$LMRM_NOW syntax for relative date values in forms and dashboards that expect dates; no quotes are needed.
  • As a user you may use the $$LMRM_NOW syntax for relative date values when creating a REST API that has date bind variables; double quotes are needed (e.g. “$$LMRM_NOW-2DAYS”).
  • Do not use the $$LMRM_NOW syntax when you schedule a job - these will not be bound. You cannot for example enter $$LMRM_NOW into a form of a query and schedule it. Instead, make the relative date part of the query and it will be bound when the query is invoked by the dispatcher. Another way to explain this is that you cannot parameterize the relativity of dates and have double binding (once from a bind variable to a $$LMRM_NOW and then from an $$LMRM_NOW to the actual time).
  • As a designer you may use the $$LMRM_NOW syntax for default values; no quotes are needed, but only $$LMRM_NOW, $$LMRM_NOW+1DAYS and $$LMRM_NOW-1DAYS are supported.
  • As a designer you may use the $$LMRM_NOW within queries; double quotes are needed. These values will be replaced when the query executes - whether on the GUI or as part of a scheduled job.
  • As a dashboard designer you may use the $$LMRM_NOW syntax for default values of a dashboard bind variable; no quotes are needed. You can also use NOW, NOW-10DAYS, NOW-17HOURS, etc., also without quotes.
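
To make the quoting rules concrete, here is the same relative date expressed in each context (consolidated from the list above):

    Form or dashboard date input (as a user):      $$LMRM_NOW-2DAYS
    REST API date bind variable (as a user):       "$$LMRM_NOW-2DAYS"
    Designer default value (only +/-1 supported):  $$LMRM_NOW-1DAYS
    Within a query (replaced at execution time):   "$$LMRM_NOW-2DAYS"
    Dashboard bind variable default:               NOW-2DAYS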

Example

Assume you have a report that you need to break down by two dimensions - servers and commands. You have three groups of servers (G1, G3 and G4) representing three business units, and you need to deliver one report for DDL commands, one for DML commands and one for administrative commands. The spreadsheet you create looks like:

_images/bind4.jpg

In this case Jane owns the G1 servers, Joe owns the G3 servers and Josh owns the G4 servers. The report is parameterized by $$commandGroup and $$serverGroup - precisely the columns for which the spreadsheet binds values. This spreadsheet is imported into a collection in the lmrm__scheduler database - for example into a collection called mt2. Then, to schedule the report to run daily at 15:10 use:
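
As an illustrative sketch of such a spreadsheet (the column names and values are assumptions based on this example; each row becomes one document in the bind collection):

    name  commandGroup      serverGroup  email                  copies
    mt1   "DDL Commands"    "G1"         ["jane@jsonar.com"]    []
    mt1   "DML Commands"    "G1"         ["jane@jsonar.com"]    []
    ...   (similar rows for Joe/G3 and Josh/G4)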

_images/bind5.jpg

Note that the name mt1 is the name used within the spreadsheet to identify the rows used for running the various reports, and that the collection name mt2 is specified at the bottom of the Job Details editor.

NOTE: Do not use $$LMRM_NOW variants as parameters - use them as bind values when you compute the URL. $$LMRM_NOW is expanded before the processing of the multiple parameter lines.

Automated Delivery to Apps, Support and Integration with Kafka

Jobs are usually created to generate either PDFs, CSVs or both, and are normally delivered via email or SCP. There are three more options - Support, Justify and Kafka.

When you select the Support option a CSV will be generated and encrypted using SonarC’s public key. You may then either email it to SonarC support or upload it to ftp.jsonar.com. Only SonarC support personnel that have access to SonarC’s private key will be able to decrypt the data.

When you select Justify the resulting data will be used to create events managed by the Justify application - as explained in the Justify documentation.

You can stream any SonarC data to Kafka queues and topics; SonarC serves as a Kafka producer. Any pipeline can be used for this, but the pipeline must produce two strings - one field called key and one called value. Then create the job, selecting the Kafka type, and enter the Kafka server/port location in the Kafka Servers field and a topic list in the Destinations to copy to field. You can publish to multiple topics by listing several topics in this field. For more information see the Kafka documentation.
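
As a minimal sketch of such a pipeline (assuming a session collection with string fields named DB User and Server IP; adapt the field mapping to your own data), the following aggregation produces the required key and value fields:

    db.session.aggregate([
      { $project: {
          _id: 0,
          key: "$DB User",      // becomes the Kafka message key
          value: "$Server IP"   // becomes the Kafka message value
      } }
    ])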

Alerting

Any analytical query that exists within the system or that you build yourself can be used for generating alerts. Alerts are records delivered through syslog or over email as the result of running the query and getting a non-empty result set. Alerts usually embed a threshold value forming the alert basis; this determines whether or not the result set is empty and thus whether or not the alert gets sent.

Example

In this example let’s use an existing analytic query to form an alert. The query looks at the “spanning tree” of every database login in the system - computing the number of distinct server IPs, the number of distinct client IPs and the number of distinct client hostnames recorded for every DB User in the system. This is one of the built-in outlier detection visualizations, showing a box-and-whiskers plot for the user span distribution with the outliers:

_images/box1.png

Behind this visualization is an analytic query. To generate an alert based on this query and a threshold navigate to custom analytics:

_images/box2.png

Choose the aggregation link on the top right:

_images/box3.png

Load the analytic query from the pipeline drop down - in this case called box_u_sip…, and click on the Publish URL button:

_images/box4.png

Change the limit to -1 and set the threshold you desire (min_span) and click on Schedule Job:

_images/box5.png

Select how often you want the query/alert to run using the cron string (below, set to once an hour) and specify who to deliver the CSV to. Make sure to uncheck the “Send also when empty results” option - this is the essence of the alert:

_images/box6.png

Using the Time Picker

_images/time_picker.png

In the top right corner of your screen you can set the global time used by Dashboards, Reports and the Analyzer.

You can set the global time using any of the pre-defined options in the “Quick” section, or by specifying a time interval in the “Relative” or “Absolute” sections.

You can control which areas are affected by the global time using the check-boxes:

  • “Apply to Reports”
  • “Apply to Dashboards”
  • “Apply to Studio”

The time selected in the time picker will impact:

  • The default time range for data to be retrieved in the dashboards
  • The default time range of reports
  • The default time set in bind variables used by the Analyzer (a.k.a. “Studio”)

Note: in order for the global time to be active for Dashboards & the Analyzer you need to set its usage for the first time:

Dashboards - for each dashboard you need to set the time preference at least once before the global time will take effect. To set the usage, follow these steps:

- Open the dashboard
- Click on “options”
- Click on “Edit parameters”
- Select a date in the “from” and “to” fields
- Click “Update Dashboard”

Analyzer - you need to have the “to” & “from” fields defined as bind variables:

- Open the Analyzer in the aggregation view
- Open the “bind” window
- Make sure you have the “to” & “from” fields defined; if not, add them
- Set a time in the “to” & “from” fields
- Save the bind settings

Adding Custom Reports

You may add additional reports to the SonarC home page. To do so, build the pipeline or query within JSON Studio (“analyze” from the SonarC home page) and then follow these steps:

  1. Click the publish checkbox and save the query/pipeline.
  2. Click on the Publish URL button.
  3. Optionally, fill in defaults for your bind variables to make it easier for the user and specify a default limit if required. If you want to use the standard report or CSV limits enter REPLIMIT or CSVLIMIT for the limit value respectively. If you have a date field as a bind variable and you want to have the default be one day prior or one day after the time when the user runs the report use $$LMRM_NOW-1DAYS and $$LMRM_NOW+1DAYS (no quotes). Use these strings exactly as they are and note that other variants are not supported. You can use these either in the default or in the value; for user entry use this in the default. Note that the Gateway call will only be populated with these values when configured in the custom report drop down.
  4. Click on the Compute URL button.
  5. Click on the Add to Custom Reports button.
  6. Set a group and a label that will be used to organize the menus and click on the add button.

If you are logged in as the admin user you can add custom reports to all users’ menus.

Configuration Parameters

To configure the SonarC GUI parameters edit <gui install dir>/jsonar/sonarFinder/WEB-INF/web.xml. All SonarC parameters start with the sonarg prefix. Configuration includes:

  • Various email configuration parameters.

  • Port on which SonarW is listening (sonarg.dbport) - defaults to 27117.

  • Database name for metadata (sonarg.lmrmdb) - defaults to lmrm__sonarg. You can specify a different database and even a database on a different SonarW / SonarC instance. This can be used to create a federated system where multiple SonarC systems share metadata (e.g. when you have one SonarC instance on-prem and one SonarC in a public cloud). Note that even when you specify a remote/different metadata store you still must have a lmrm__sonarg database on the local SonarW instance as that database is still used for the following data:

    1. guardium_groups
    2. Operations monitoring metadata
    3. Predefined report definitions.
  • Scheduler URI - the scheduling subsystem needs to be able to write scheduled job information into the lmrm__scheduler database. Specify this URI to enable scheduling.

  • Report type - by default reports are generated using a Google Visualization Javascript library. This library needs to be accessed over the Internet. For SonarC systems that cannot access the Internet an offline reporting feature is available. To make SonarC use the offline library rather than access the Internet, stop the sonarfinder service, add the following to the web.xml file, and start the sonarfinder service:

    <context-param>
          <param-name>sonarg.reportType</param-name>
          <param-value>offline</param-value>
    </context-param>
    

When using offline reports the _id field is excluded, since it is usually for internal use only. When you are in JSON Studio the _id will show in the reports to help you design queries, but when used from the SonarC home page the report will omit the field. The same is true when you schedule a report for delivery. If you wish to override this and include the _id value in a Web service or in dispatched results, add forceId=1 to the “Add to URL” field when scheduling.

Developer Enhancements - Drill Downs

Depending on the visualization type, you can add drill downs from a visualization to the underlying data as a report. The supported visualizations are heatmaps, multi-donuts and box plots when they are part of a dashboard. The predefined dashboards all have the appropriate drill-downs configured.

If you develop new dashboards and new visualizations you can add drill downs using metadata. A drill down is a metadata document that connects an existing viz to a pipeline saved in the system, and is saved in the lmrm__drill_down collection in the lmrm__sonarg database, for example:

{
        "_id" : 22,
        "name" : "360_class_summary",
        "output" : "graph.360_class_summary_c",
        "published_by" : "lmrm__ae",
        "dd" : "name=360_class_dd&col=classifier&type=agg&output=report&published_by=lmrm__ae&bind.Server IP=__lmrm__extract_SIP&bind.Hostname=__lmrm__extract_Hostname&bind.from=__lmrm__extract_From&bind.to=__lmrm__extract_To&bind.Category=__lmrm__extract_Category&bind.Classification=__lmrm__extract_Classification&bind.Rule=__lmrm__extract_Rule&limit=10000&findlimit=10000&graphPoints=10000",
        "bind" : {
                "SIP" : {
                        "url" : {
                                "Server IP" : 1
                        }
                },
                "Hostname" : {
                        "url" : {
                                "Hostname" : 1
                        }
                },
                "Category" : {
                        "value" : {
                                "string" : 1
                        }
                },
                "Classification" : {
                        "value" : {
                                "string" : 2
                        }
                },
                ...
        }
}

The document specifies a drill down available on the viz called 360_class_summary_c, based on the 360_class_summary pipeline published by lmrm__ae. When a node in the viz is double-clicked, a report is opened using the 360_class_dd pipeline. Because a drill down might require values from the dashboard parameters as well as values in the node that was clicked, the bind section in the document allows you to specify either one. url means anything in the URL that forms the viz (hence including the dashboard parameters); value is a value of the clicked node. The key (e.g. string above) is the operator applied to the value, and the value (1 or 2 above) is the location of the value required - for example, a node in a heatmap has three values per node: the value itself used to color the node, the value of the row, and the value of the column.
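
For example, reading the bind entries in the document above under this scheme (an interpretation, assuming node values are numbered with the row value as 1 and the column value as 2):

    "SIP"            : { "url"   : { "Server IP" : 1 } }  // take the Server IP parameter from the viz URL
    "Category"       : { "value" : { "string" : 1 } }     // copy node value 1 (the row value) as a string
    "Classification" : { "value" : { "string" : 2 } }     // copy node value 2 (the column value) as a string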

Supported operators for value mapping are:

  • string: Copy the value as a string - e.g. the DB user name or the server IP.
  • stringNonEncoded: Same but without URL encoding applied to the value.
  • datePlusHour: Works on a string of the form “2017-05-07T00” and generates a date.
  • datePlusHourOneHour: Works on a string of the form “2017-05-07T00” and generates a date which is one hour later.
  • dateOnly: Works on a string of the form “2017-05-07” and generates a date.
  • dateOnlyOneAdditionalDay: Works on a string of the form “2017-05-07” and generates a date one day later.
  • ipLookup: Gets a string which is either an IP or a hostname (based on the preferences of the user) and returns the IP.
  • beforeLast: The operator takes the form of “beforeLast:”, “beforeLast@” etc. If using “:” on a string “56098-SQLNQ075:729:ORACLE” returns 56098-SQLNQ075.
  • beforeLastTrim: As above but trim any whitespaces.
  • beforeLastEncoded: As above but URL encodes the string.
  • afterLast: Operator syntax as above. If using “:” on a string “56098-SQLNQ075:729:ORACLE” returns ORACLE.
  • afterLastTrim: As above but trim any whitespaces.
  • afterLastEncoded: As above but URL encodes the string.