Optimizing System Performance

Agiloft can support thousands of users and generate over 200,000 records per hour on an inexpensive server, as detailed in the Scalability and Redundancy document. Even in such an environment, it is worth taking steps to keep system performance at its best. To achieve optimum performance, use the methods and tips in this article for your on-premises deployment of Agiloft, especially if you believe you're experiencing reduced system performance.

If you are not using our hosted service and have Agiloft installed on your own server, first complete the performance test described in the Test Basic System Performance section before doing anything else. This test determines whether your system meets basic performance standards, and you should make sure it produces acceptable results before you attempt to optimize your setup.

Prerequisites

Depending on the performance improvement method, you may need the following:

  • Access to the server file system
  • Access to the admin console
  • Knowledge of MySQL

Test Basic System Performance

If you are using our hosted service, there is no need to run this test because all our servers exceed the basic performance requirements.

If you have Agiloft installed on a server inside your firewall, you have access to the admin console. This allows you to run a quick test to confirm that your basic setup provides adequate performance before you spend time on optimization.

This test is entirely independent of your KB; it is designed to find issues with the hardware and system configuration. The test takes about ten minutes and strongly impacts performance while it's running, so it is best performed outside of business hours. To run the test, log into the admin console and go to Debugging > Performance > Run Performance Test.

Performance Test Results

When the system performance test is complete, you receive a result like this:

Example performance test results

The test measures performance with three metrics, each of which depends on an aspect of your system:

  • Time taken to import KB: Input/output (I/O) of the server. An acceptable result is 300 seconds or less.
  • Time taken to generate 4,000 records: Number and speed of CPU cores. An acceptable result is 200 seconds or less.
  • Time to render reports: Disk speed for read operations. An acceptable result is 500 seconds or less.

If your results are greater than the thresholds above, you are likely to experience suboptimal performance. Common causes include:

  • The hardware does not meet the recommended system requirements.
  • You are using a database on a remote machine that is slow or is connected to the Agiloft server by a slow connection. If you need to use a remote database, we recommend a 10 Gigabit Ethernet or InfiniBand connection. The best performance is achieved by using the local copy of MySQL recommended for download by the installer.
  • You are running in a virtualized environment and other virtual machines are using a lot of resources.

If you are running on a server with a local database, SSD drives, and fast CPUs, you should get results similar to the following:

  • Time taken to import KB: Around 130 seconds or less.
  • Time taken to generate 4,000 records: Around 115 seconds or less.
  • Time to render reports: Around 480 seconds or less.

Reduce the Amount of Transferred Data

Table and record view loading times are determined in part by the number of records and fields loaded onto the screen at one time. Each record and field represents a piece of data transferred from the database to the browser. If you experience slow loading times in certain tables or when opening certain kinds of records, the following tips can help reduce data transfer and improve performance:

  • Optimize table views: From a table's action bar, hover over Views, click Edit, and minimize the amount of data displayed in a table view as follows:
    • On the Fields tab, only display fields that you really need to see.
    • On the General tab, display fewer records per page, such as 25 rather than the maximum of 150.
  • Set table refresh rates to Never: Limit how often view data is refreshed by navigating to Setup > Access > Manage Teams, editing a team, and then setting the Table View Refresh Rate to Never. Even with Never selected, the table view refreshes each time a user navigates back to the table. Table view refresh rates also affect the system's timeout behavior for user inactivity; if table refreshes are enabled, the user's session never times out.
  • Reduce the amount of data transferred when records are edited: Related Tables in a record layout can slow down record opening. To minimize this impact:
    • Move any Related Tables or Embedded Search Result tables off the first tab of the layout so they can load in the background.
    • If you have linked fields with more than 50 values, change the display type to a box with lookup so the list doesn't have to be loaded.

Use a Fast Browser

Different browsers provide vastly different performance. Firefox is the recommended browser and is available on all platforms, except for mobile devices. Safari and Chrome are as fast as Firefox and are fully supported. IE11 is significantly faster than IE9 and is fully supported, but it's still slower than Firefox, Safari, or Chrome.

If you are using Firefox, we also recommend you disable the following add-ons, which are known to cause performance problems:

  • Firebug
  • ColorZilla
  • Adblock Plus

Store Email Addresses and Names Separately

If you want to use an email field to define record ownership, it's best to store email addresses and names in separate fields. To do so, go to Setup > Email and SMS > Configure Inbound Email, edit a configuration, and go to the Record Mapping tab. An option there allows you to store email addresses and names in separate fields.

When the email address and name are stored separately, the system determines record ownership from the email address alone, which prevents errors caused by name-sensitive email recognition and avoids indexing issues.

Optimize Time-Based Rules with If-Then-Else Actions

When time-based rules with If-Then-Else actions operate on a large number of records, and the majority of those records are likely to be unchanged by the If-Then-Else action, performance can suffer drastically. This is because each record must be read from the database before the If-Then-Else action can operate on it. To avoid performance issues caused by this setup, create a saved search on the Condition tab of the rule that only finds records that some action within the If-Then-Else action actually operates on.

For example, the following rule is highly inefficient if there are many records with a status of Open and only a few records with Urgent or Critical priority:

Condition: Find all Open records

Action:

  • If (Priority == Critical) Then Send Escalation Email to Team Leader
  • Else If (Priority == Urgent) Then Send Escalation Email to Team

This rule causes the system to parse all records with a status of Open and ultimately ignore them unless their priority is Urgent or Critical. Instead, create a more efficient rule with a condition that operates on far fewer records:

Condition: Find all Open records where Priority == Critical or Priority == Urgent

Tune the Database

If you are comfortable with databases and have a good feel for the makeup of the data and which searches are used most frequently, it can be very helpful to add database indexes. If you are unsure or need the changes to be made outside of working hours, it may be worth using Agiloft consulting services to take care of this task.

If a certain field is often used for searching, and the number of matching records is much lower than the number of records in the knowledgebase, you can improve search speed by adding a database index for that field.

For example, adding an index to the Assigned To field increases performance if there are 100 records assigned to a particular team or individual in a database of 100,000 records. However, if there are 50,000 records assigned to that team or individual, adding indexing does not help at all and may actually hurt performance.
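As a sketch of what such an index looks like in MySQL, assuming a hypothetical `case0` table whose `assigned_to` column backs the Assigned To field (verify the actual table and column names in your own schema before running anything):

```sql
-- Hypothetical table and column names; confirm them in your schema first.
-- The table may be locked while the index builds, so run this outside
-- of working hours.
CREATE INDEX idx_case0_assigned_to ON case0 (assigned_to);
```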

Although database indexes can be very effective, you have to be careful because:

  1. Database locking makes the table unavailable while the index is being added. This may take anywhere from a couple of minutes to several hours for tables with hundreds of thousands of records, so consider adding indexes outside of working hours.
  2. Inappropriate or excessive indexes actually hurt performance. In general, you should not have more than 7 or 8 indexes per table. In some cases, such as searches for "Assigned to == $global.my_Full_Name and Status == Open," it may be most efficient to add a compound index.

For more information on creating indexes for a table, see Indexing.

Construct Optimal Indexes

The optimum index for queries reported in the ewoptimizer.log, or obtained from the MySQL admin report on long-running queries, is generally one that includes the fields from the query, in the same order in which they appear in the query. Creating the best index requires some experience and judgment; if it did not, database engines would simply create indexes automatically.
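As a sketch, a compound index for the "Assigned to and Status" search mentioned earlier might look like the following in MySQL (the table and column names are hypothetical; check your schema for the real ones):

```sql
-- Hypothetical names: verify the real table and column names in your schema.
-- The column order mirrors the order the fields appear in the query
-- (assigned_to first, then status).
CREATE INDEX idx_case0_assigned_status ON case0 (assigned_to, status);

-- A query this index serves, shaped like the saved search
-- "Assigned to == $global.my_Full_Name and Status == Open":
-- SELECT ... FROM case0 WHERE assigned_to = 'Jane Example' AND status = 'Open';
```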

Find Long-Running SQL Queries

SQL queries that take a long time to execute can indicate a performance issue. To identify long-running queries, check AL_HOME/mysql/logs/ewmysql-slow-queries.log for queries that take more than 10 seconds.
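A quick way to pull the offending queries out of that log is to filter on the Query_time header that each entry carries. The snippet below is a sketch: it builds a tiny sample file in the standard MySQL slow-log format (the entries and the /tmp path are illustrative, not from a real installation) and then prints only the queries slower than 10 seconds.

```shell
# Build a small sample file in the MySQL slow-query-log format so the
# filter can be demonstrated end to end. The entries are illustrative.
LOG=/tmp/sample-slow-queries.log
cat > "$LOG" <<'EOF'
# Query_time: 14.2  Lock_time: 0.01  Rows_sent: 3  Rows_examined: 250000
SELECT id FROM case0 WHERE summary LIKE '%timeout%';
# Query_time: 0.4  Lock_time: 0.00  Rows_sent: 1  Rows_examined: 12
SELECT id FROM case0 WHERE id = 42;
EOF

# Print each query whose Query_time exceeds 10 seconds, i.e. the entries
# worth investigating for missing indexes.
awk '/^# Query_time:/ { slow = ($3 + 0) > 10; next } slow && /^SELECT/ { print }' "$LOG"
```

To filter your own log, point the awk command at AL_HOME/mysql/logs/ewmysql-slow-queries.log instead of the sample file.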

If certain workflows cause a performance slowdown, use the admin console to turn on Debugging, run through the steps that trigger the slowdown, and check the server.log.

Load Testing and Performance Optimization

We provide consulting services to load test and optimize your application. The optimization process can range from one day to several weeks, depending on the scope: at the simpler end, we review log files and create or modify indexes based on the resulting information; at the more involved end, we create test programs that simulate production use and precisely measure the number of concurrent users your server can support.