Genai lab Instructions #1666

Open · wants to merge 5 commits into base: main
117 changes: 117 additions & 0 deletions docs/en/alab/annotation_labs_releases/release_notes_6_10_0.md
@@ -0,0 +1,117 @@
---
layout: docs
header: true
seotitle: Generative AI Lab | John Snow Labs
title: Generative AI Lab Release Notes 6.10
permalink: /docs/en/alab/annotation_labs_releases/release_notes_6_10_0
key: docs-licensed-release-notes
modify_date: 2024-08-25
show_nav: true
sidebar:
nav: annotation-lab
---

<div class="h3-box" markdown="1">

## 6.10

Release date: **01-13-2025**

## Generative AI Lab 6.10: Faster Preannotation and Improved Non-Overlapping Relations
Generative AI Lab 6.10 delivers marked improvements to the speed of our pre-annotation function, along with usability improvements to relation visualizations that give a clearer view of projects.

Additional features include the ability to use annotation guidelines in HTML projects, a UI improvement to the analytics request page, and other small improvements.

## Enhanced NER Pre-annotation with Upgraded Process Pipeline
Version 6.10 upgrades and fine-tunes the pre-annotation pipeline, dramatically reducing processing time.

Redesigning the workflow for these processes increases throughput by more than **3x**: benchmark data consistently showed that datasets previously pre-annotated at roughly 300 tasks per hour now process at roughly 1,100 tasks per hour. This improvement directly supports customers whose annotation teams rely on pre-annotation.
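For reference, the benchmark figures quoted above translate into the following speedup (illustrative arithmetic only; the rates are the approximate values reported in this release note):

```python
# Illustrative arithmetic only: throughput figures are the ones quoted above.
old_rate = 300    # tasks pre-annotated per hour before the upgrade
new_rate = 1100   # tasks pre-annotated per hour after the upgrade

speedup = new_rate / old_rate
print(f"Speedup: {speedup:.2f}x")  # → Speedup: 3.67x
```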

The new pre-annotation process also improves the performance of de-identification projects, which rely on the same function.

## Improved Non-Overlapping Relations

The display logic for relation lines has been refined to prevent overlapping. Relation arrows and labels are now strategically spaced and arranged into tiers based on the number of overlaps for each line, providing a clean and organized visual presentation. This improvement significantly enhances readability and reduces confusion when analyzing complex relationships between text chunks.

**Before:**
![6100image](/assets/images/annotation_lab/6.10.0/1.png)

In earlier versions, when multiple relations were defined between text chunks positioned close to each other, the arrows and labels representing these relations would often overlap. This overlap created visual clutter, making it difficult for users to accurately distinguish and interpret the relations.

**After:**
![6100image](/assets/images/annotation_lab/6.10.0/2.png)

The improved relation visualization feature, "**Accommodate Relation Overlap**," is now enabled by default. It can be disabled in cases where a large number of stacked relations makes the text difficult to read. The goal of this feature is to reduce ambiguity in relations.

![6100image](/assets/images/annotation_lab/6.10.0/3.png)
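The tier-based arrangement described above can be sketched as a greedy interval-placement algorithm: each relation arc is assigned to the lowest tier where it does not horizontally overlap any arc already placed there. This is a minimal illustrative sketch; the actual display logic in Generative AI Lab is not published, so the function name and data shapes here are assumptions.

```python
def assign_tiers(relations):
    """Assign each relation arc (start, end) to the lowest tier where it
    does not horizontally overlap any arc already placed on that tier.

    `relations` is a list of (start, end) character offsets; returns a
    mapping from relation index to tier number (0 = closest to the text).
    """
    tiers = []       # each tier holds the (start, end) spans placed on it
    placement = {}   # relation index -> tier number
    # Place arcs left-to-right so earlier arcs claim lower tiers first.
    for idx, (start, end) in sorted(enumerate(relations), key=lambda t: t[1]):
        for level, spans in enumerate(tiers):
            # An arc fits on a tier only if it clears every span there.
            if all(end <= s or start >= e for s, e in spans):
                spans.append((start, end))
                placement[idx] = level
                break
        else:
            # No existing tier has room: open a new one above.
            tiers.append([(start, end)])
            placement[idx] = len(tiers) - 1
    return placement

arcs = [(0, 5), (3, 8), (6, 10), (1, 9)]
print(assign_tiers(arcs))  # overlapping arcs land on separate tiers
```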

## Improvements

### Redesigned Analytics Permission Request Page
To enhance user experience and clarity, the Analytics Dashboard activation process has been updated with the following improvements:

When navigating to the **Analytics** page for a project where the dashboard is not enabled, users are presented with:
- Two buttons: “**Go Back**” and “**Send Request**”
- A clear informational message:
"**Analytics Dashboard Not Enabled for This Project**
`Request the Generative AI Lab administrator to enable the Analytics Dashboard for this project.`"

![6100image](/assets/images/annotation_lab/6.10.0/5.png)

Upon clicking the “**Send Request**” button, the message updates to:

"**Analytics Dashboard Request Sent.**
`Once the request is approved by the admin, the dashboard will be available for use.`"

Also, the “**Send Request**” button becomes disabled, preventing duplicate submissions.

![6100image](/assets/images/annotation_lab/6.10.0/6.png)

Clicking the “**Go Back**” button returns users to the previous page, letting them continue with other tasks while awaiting approval or without submitting a request. These updates deliver a clean UI for requesting dashboard access, ensuring clarity and transparency in the activation process.

### Bug Fixes

- **De-identification not working in Section Based Annotation-enabled project**

Section-based annotation filters tasks by relevant sections. When such tasks were pre-annotated using models and then de-identified, the de-identified text was previously not visible in these sections, as shown in the comparison screenshots below. This issue has now been resolved: users can view the de-identified text by clicking the **Compare De-identified Data** button and then export it as needed.

**Before:**
![6100image](/assets/images/annotation_lab/6.10.0/9.png)

**After:**
![6100image](/assets/images/annotation_lab/6.10.0/10.png)

- **Model Publishing Fails with Error**

Users can once again publish their models to the models hub.

- **Users attempting External Prompts in Visual NER projects**

Visual NER Projects now have validation to prevent relation prompts and external prompts from being attempted, as this feature is not currently available.

- **Users can combine Visual NER model with Rules during project configuration**

A validation error is now displayed when users attempt to add rules alongside a Visual NER model.

- **"Define What to Annotate" tab is hidden if the user tries to add/remove the External Classification Prompt**

Users can no longer add classification prompts to the visual project.

- **Highlight Drafts on Annotation page**

The Completions section has been updated to ensure consistent capitalization throughout the interface. Additionally, the text color for these messages has been changed to orange to enhance visibility and emphasis.

Message in the Completion Tab When a Draft Is Saved:

![6100image](/assets/images/annotation_lab/6.10.0/7.png)

Message in the Completion Tab When Viewing the Last Saved Annotation:

![6100image](/assets/images/annotation_lab/6.10.0/8.png)

</div><div class="prev_ver h3-box" markdown="1">

## Versions

</div>

{%- include docs-annotation-pagination.html -%}
13 changes: 13 additions & 0 deletions docs/en/alab/training_configurations.md
@@ -213,6 +213,19 @@ When triggering the training, users can choose to immediately deploy the model o

<img class="image image__shadow" src="/assets/images/annotation_lab/4.1.0/train_model_deployment.png" style="width:100%;"/>

## Model Versioning when Training Models
Generative AI Lab 6.9 introduces model versioning for the following project types: Named Entity Recognition (NER), Classification, Assertion, Relation, and Visual NER. In the **TRAINING SETTINGS** section of the **Train** page, a toggle labeled **Enable Versioning** is now available. By default, model versioning is disabled. To enable it, toggle **Enable Versioning** to **on**.

![690image](/assets/images/annotation_lab/6.9.0/5.png)

When enabled, models are saved with versioned names following the format **projecttype_projectname_v1**, **projecttype_projectname_v2**, and so on. If model deployment is enabled after training is complete, the most recently trained model is automatically applied to the project configuration. If model deployment after training is not enabled, the project configuration remains unchanged. All versions of trained models are accessible on the Reuse Resource page, allowing users to browse and select specific model versions for reuse in other projects.

![690image](/assets/images/annotation_lab/6.9.0/6.png)

Model versioning is also supported for previously created projects. If versioning is disabled, subsequent training overwrites the most recent model without creating a new version. When re-enabled, versioning resumes from the latest version rather than starting over from v1. This feature simplifies model management by enabling version tracking and reusability, offering seamless integration for new and existing projects.

Note: The **Enable Versioning** toggle is disabled during training.
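The **projecttype_projectname_vN** naming convention described above can be sketched as follows. This is an illustrative sketch only: the helper name and the example model names are assumptions, not part of the Generative AI Lab API.

```python
import re

def next_version_name(project_type, project_name, existing_models):
    """Compute the next versioned model name following the
    projecttype_projectname_vN convention described above."""
    prefix = f"{project_type}_{project_name}_v"
    pattern = re.compile(re.escape(prefix) + r"(\d+)$")
    # Collect version numbers of existing models that match the convention.
    versions = [int(m.group(1)) for name in existing_models
                if (m := pattern.match(name))]
    # Resume from the latest version, or start at v1 if none exist.
    return f"{prefix}{max(versions, default=0) + 1}"

models = ["ner_clinical_v1", "ner_clinical_v2"]  # hypothetical model names
print(next_version_name("ner", "clinical", models))  # → ner_clinical_v3
```

Note how, consistent with the behavior described above, training resumes from the latest existing version rather than starting over from v1.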

#### License Requirements

Visual NER annotation, training, and preannotation features depend on the presence of a [Visual NLP](/docs/en/ocr) license. Licenses with the scopes `ocr: inference` and `ocr: training` are required for preannotation and training, respectively.
4 changes: 2 additions & 2 deletions docs/en/alab/version.md
@@ -6,7 +6,7 @@ seotitle: Generative AI Lab | John Snow Labs
title: Bundled Libraries Versions
permalink: /docs/en/alab/version
key: docs-training
modify_date: "2024-04-8"
modify_date: "2024-12-11"
use_language_switcher: "Python-Scala"
show_nav: true
sidebar:
@@ -31,4 +31,4 @@ The table below displays the bundled version of NLP Libraries associated with va
| 6.0.0 - 6.2.1 | 5.3.1 | 5.3.1 | 5.3.0 | 1.0.0 | 1.0.0 | 3.4.0 |
| 6.3.0 - 6.6.2 | 5.3.2 | 5.3.2 | 5.3.2 | 1.0.0 | 1.0.0 | 3.4.0 |
| 6.7.0 - 6.8.2 | 5.4.0 | 5.4.0 | 5.4.0 | 1.0.0 | 1.0.0 | 3.4.0 |
| 6.9.0 - latest | 5.4.0 | 5.4.0 | 5.5.0 | 1.0.0 | 1.0.0 | 3.4.0 |
| 6.9.0 - latest | 5.4.0 | 5.4.0 | 5.5.0 | 1.0.0 | 1.0.0 | 3.4.0 |