AI Trainer
AI Trainer is Agiloft's proprietary platform for training AI models. With AI Trainer, you can customize the type of content that AI can extract from contracts.
This process works by feeding an AI model a dataset that contains examples of what you'd like it to recognize. Once the model has been properly trained on that dataset, it can recognize similar content in new documents.
Using AI Trainer
Consult the following sections for information on how to use AI Trainer's different workflows.
Creating an AI Project
All trainings go through the AI Project record. To create an AI Project:
- In your KB, navigate to the AI Projects table.
- Click New.
- Choose AI Label Training from the Type drop-down list.
- Name the project.
- Click the search icon next to Models.
- Select one or more label-finding models to train. You can add as many labels as you like to the same project; the models you select here determine what you can tag in your training set documents. If a suitable Model record doesn't exist yet, first verify that it hasn't already been created, and then create it here. To create a new Model record:
- Select Label Model from the Create new drop-down at the top right-hand corner of the Model search.
- A modal window opens. Add a Name.
- Choose a Model Type.
- Optionally add a value for Applies to Document Types based on the type of contract that this training applies to. Leaving this field blank indicates that this is a generic label-finding model that can be used on multiple contract types.
- To the right of Associated Label Library Entry, either:
- Click Link to existing Label if you already have a relevant Label record in your Label Library table that this Label Model could be stored under.
- Select the proper Label record.
- Click Import/Append.
- Click New if you do not have a relevant Label record in your Label Library table.
- Add a Name.
- Add a Label Type.
- Click Save.
- From the Add Documents From drop-down list, select the location of your document set. It's simplest to keep the documents in one source, but you can combine documents from any of the following locations by changing the drop-down value after adding documents from each source:
- Upload from your computer: select contracts saved on your computer to use as training data.
- Another AI Project: use contracts that were already added to an existing AI Project record.
- Agiloft Contract Lifecycle Management: select contracts that already exist as Attachments in your KB.
- Click Attach/Manage or drag and drop files to add documents.
- When you've added all the files, click Add Documents. The documents appear in the Training Set Documents section of the project with a status of Processing while they are evaluated for Document Quality and Document Type. When a document has been fully processed, its quality and type are updated and its status changes to Ready to Annotate. Any documents that can't be analyzed are shown on the Flagged Documents tab. Remember that variety matters more than any single document: if OCR has done a poor job and the data looks wrong around the words you want to label, it's usually best to unlink that document from the project.
- Then, begin annotating the Ready to Annotate documents according to your project's labeling policy. For more information about completing an AI Trainer project in Agiloft, consult your Agiloft representative.
- Once the documents have been properly annotated, you can click Start Training from the Training tab.
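The document intake flow above can be pictured as a small state machine. The following Python sketch is purely illustrative and is not Agiloft's implementation; the class and function names are hypothetical, chosen only to mirror the statuses described in the steps.

```python
# Hypothetical sketch of the training-set document flow described above:
# documents start in Processing, move to Ready to Annotate once Document
# Quality and Document Type have been evaluated, or are flagged if they
# can't be analyzed. Names here are illustrative, not Agiloft APIs.
from dataclasses import dataclass
from typing import Optional

@dataclass
class TrainingDocument:
    name: str
    status: str = "Processing"
    quality: Optional[str] = None
    doc_type: Optional[str] = None
    flagged: bool = False

def process(doc: TrainingDocument, analyzable: bool,
            quality: str, doc_type: str) -> TrainingDocument:
    """Evaluate a newly added document, mirroring the statuses in the UI."""
    if not analyzable:
        doc.flagged = True            # appears on the Flagged Documents tab
        return doc
    doc.quality = quality             # Document Quality result
    doc.doc_type = doc_type           # Document Type result
    doc.status = "Ready to Annotate"  # now eligible for annotation
    return doc

doc = process(TrainingDocument("nda.pdf"), analyzable=True,
              quality="Good", doc_type="NDA")
```

This is only a mental model for the statuses you'll see in the Training Set Documents section; the actual evaluation happens automatically when you click Add Documents.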
Generate AI Suggestions
You can use the Generate AI Suggestion button from the Training Set Documents embedded table to find examples of your label in newly added training set documents. You can view these suggestions in the document viewer under the AI Suggestions tab.
This button uses the current Active version of the model to automatically tag data in a selected document in Ready to Annotate status. Because an Active version only exists after the model has been trained at least once, you cannot generate suggestions the first time you train a model.
This is a good way to test the label on specific documents, as well as streamline the annotation process.
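The preconditions described above can be summarized in a short sketch. This is a hypothetical illustration of the guard logic, not Agiloft code: generating suggestions requires an Active model version (which only exists after at least one training run) and a document in Ready to Annotate status.

```python
# Hypothetical sketch of the Generate AI Suggestions preconditions.
# Function and parameter names are illustrative, not an Agiloft API.
from typing import List, Optional

def generate_ai_suggestions(active_model_version: Optional[str],
                            doc_status: str) -> List[str]:
    if active_model_version is None:
        # No Active version exists until the model has been trained once.
        raise ValueError("Train the model at least once before generating suggestions.")
    if doc_status != "Ready to Annotate":
        raise ValueError("Suggestions run only on Ready to Annotate documents.")
    # Placeholder for the tags the Active model version would propose;
    # in the product, these appear under the AI Suggestions tab.
    return []
```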