Try This: Let Your AI Agent Read Charts, Not Just Data

pradeepta

I was talking to a friend this evening, and we were brainstorming how to use AI agents for a fairly complex analysis. His first instinct followed the traditional data workflow: gather the data, load it into an offline warehouse, and run analysis on top of it. Historically, you'd hand-code that analysis; now, you might feed the data to an AI model with specific instructions to generate insights.

But that led us to a different way of looking at things.

Humans Don’t Stare at Raw Data

When humans analyze something complex, they’re usually not looking at raw tables. They’re looking at visualizations: charts, graphs, dashboards, network diagrams, and so on. In many real-world scenarios, the “analysis surface” is a set of visual artifacts, not a CSV file.

So why are we always feeding agents raw data instead of what we ourselves actually use?

A Simple Experiment to Try

For your next experiment, try this: instead of giving an agent (like ChatGPT or Claude) your raw data, give it screenshots.

Screenshots of:

  • Dashboards
  • Complex charts
  • Graphs and visualizations

You might be surprised by how human-like the answers feel—and how accurate they can be—just from those screenshots alone.
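If you want to run this experiment through an API rather than a chat UI, the idea is just to pair a screenshot with your question in a single multimodal message. Here's a minimal sketch assuming the Anthropic Messages API shape (base64-encoded image blocks); the file path and question are placeholders, and other vision-capable APIs use a similar structure:

```python
import base64


def build_screenshot_prompt(image_bytes: bytes, question: str) -> list:
    """Pair a dashboard/chart screenshot with an analysis question,
    in the message format vision models expect (Anthropic-style here)."""
    # Vision APIs typically take images as base64-encoded strings.
    encoded = base64.standard_b64encode(image_bytes).decode("utf-8")
    return [
        {
            "role": "user",
            "content": [
                {
                    "type": "image",
                    "source": {
                        "type": "base64",
                        "media_type": "image/png",  # match your screenshot format
                        "data": encoded,
                    },
                },
                {"type": "text", "text": question},
            ],
        }
    ]


# Usage sketch (path and model name are hypothetical):
# with open("dashboard.png", "rb") as f:
#     messages = build_screenshot_prompt(f.read(), "What trend stands out here?")
# response = client.messages.create(model="...", max_tokens=512, messages=messages)
```

The point is that the agent sees exactly the visual artifact you would look at, rather than the raw table behind it.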

Give it a try.
