Scan, understand, and activate on-screen data with Lens
According to Gartner®, “Over 75% of organizations state that AI-ready data remains one of their top five company investment areas in the next two to three years.”1 As AI reshapes business, it’s imperative for organizations to eliminate the fragmented systems, delayed insights, and siloed data that prevent them from solving the complex problems employees face every day.
Too often, siloed data is on an employee’s screen, visually connected to the employee but disconnected from the AI and automation needed to expedite problem resolution or task completion.
That’s why I’m excited to introduce Lens from ServiceNow. It extends Workflow Data Fabric to scan, understand, and activate your enterprise data no matter where it resides—including on employees’ screens.
What is Lens?
Lens is a native app for Windows and Mac (both available now) and mobile (coming soon), powered by a vision large language model (LLM) and natively integrated with ServiceNow lists and forms.
It captures anything on a user’s screen—an email, website, error log, diagram, document, dashboard, or image—and instantly transforms it into actionable intelligence for ServiceNow workflow and task automation, or provides standalone, ad hoc analysis and answers.
This cutting-edge tool and Now Assist skill empowers requesters, agents, and developers with visual automation. It enhances processes, improving efficiency and unlocking intelligent insights across the ServiceNow ecosystem.
Users can extract on-screen data and use it to autofill any ServiceNow form to create or update records. In addition, Lens can synthesize the captured data, answer questions about it, and respond to custom prompts directly in the Lens application.
How does Lens work?
Imagine a user sees an error log on their screen and needs to create an incident. Previously, they would have had to interpret the error log, write a description, and fill in the incident form.
With Lens, the user can click the “Create with Lens” button right from the incidents list. The same form opens, but an “Analyze screens using Lens” dialog comes up.
The user points Lens to the error log and, since Lens was opened from the incident form, it automatically creates a Now Assist prompt, which the user can modify as needed.
When the user clicks “Analyze,” Lens scans the error log and automatically fills in the incident form with all the details. If prompted, Lens can also suggest a resolution and enter it in Resolution Information. This saves the user precious time, provides a more precise description that avoids back-and-forth clarification, and expedites incident resolution.
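To make the autofill step concrete, here is a minimal sketch of the end result: fields pulled from an error log and posted to the incident table through the standard ServiceNow Table API. The regex extraction, instance URL, credentials, and field values below are illustrative assumptions only; Lens itself infers these details from a screen capture with its vision LLM rather than from raw text.

```python
# Illustrative sketch only: it shows the *outcome* of Lens's autofill
# (an incident created from error-log details) using the public
# ServiceNow Table API. The parsing logic, instance URL, and
# credentials are hypothetical placeholders.
import re
import requests

error_log = """\
2025-03-14 09:12:45 ERROR OrderService - Payment gateway timeout
host=app-node-07 code=GW_TIMEOUT correlation_id=7f3a9c
"""

# Naive field extraction, standing in for what the vision LLM infers from the screen.
summary = re.search(r"ERROR (.+)", error_log).group(1).strip()
host = re.search(r"host=(\S+)", error_log).group(1)

payload = {
    "short_description": f"Error on {host}: {summary}",
    "description": error_log,
    "urgency": "2",
    "category": "software",
}

# Create the incident record (replace the instance and credentials with real values).
response = requests.post(
    "https://your-instance.service-now.com/api/now/table/incident",
    auth=("api_user", "api_password"),
    headers={"Accept": "application/json"},
    json=payload,
    timeout=30,
)
response.raise_for_status()
print("Created incident:", response.json()["result"]["number"])
```

In practice, Lens handles the extraction and field mapping automatically on the user's behalf; the sketch only illustrates the kind of record it populates.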
Key features of Lens
Lens is poised to revolutionize the way organizations handle visual data. It’s an indispensable tool for all employees to address complex issues and tasks with ease, increasing productivity and process efficiency through these features:
- On-demand data extraction allows users to extract data from multiple sources in real time, providing instant access to critical information they can use to fill forms or act on directly.
- Instant form or script filling automatically fills forms or generates scripts based on the extracted data, improving efficiency.
- Custom prompt support lets users provide their own prompts before analyzing data, ensuring tailored outputs that meet specific requirements.
- Multicapture captures data from multiple sources or scans and combines it to generate summarized output.
- Standalone mode lets users open the Lens app on their desktop or mobile device, point Lens to anything on the screen, enter ad hoc prompts, and get instant analysis, answers, and summaries.
- Secure user sessions require specific roles and permissions, keeping access controlled.
- Multipersona usage caters to different personas, including requesters, agents, and developers, to deliver unparalleled value across the ServiceNow ecosystem.
Get Lens from the ServiceNow Store to put AI to work for your people.
1 Gartner Research, CIO Guide to AI-Ready Data, Jan. 8, 2025
GARTNER is a registered trademark and service mark of Gartner Inc. and/or its affiliates in the U.S. and internationally and is used herein with permission. All rights reserved.