Creating Dynamic Data Visualizations with OpenAI's GPT-3 and React
For the past few years, I’ve been mainly developing solutions related to data visualization. When the latest OpenAI GPT model was released a few months ago, my manager called me, full of hype: he had been trying it with some CSV datasets, and the model seemed to understand and summarize the data.
I started to play with the completion API right away, and after a while, I realized that I could give the AI context and make it respond with JSON only.
Then, the idea just clicked: if I send a dataset to this new AI and ask for a specific JSON response, I can create a dynamic dashboard generator that produces filters, indicators, and appropriate charts for the data at hand.
How to talk with GPT?
At first, I struggled because I didn’t know how to talk with GPT, so I loaded one of the examples in the playground called “Chat”, then started to paste some data and ask questions about it.
After a little investigation, I found that you could change the header of these conversations to make the model answer differently, for example by giving it the profile of a data science expert.
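In API terms, that "header" is the system message that precedes the conversation. The original wording isn't reproduced here, so the prompt text and helper name below are my own illustration of the idea:

```typescript
// Sketch of how the "header" (system message) shapes the model's behavior.
// The prompt wording is illustrative, not the exact text from the article.
type ChatMessage = { role: "system" | "user"; content: string };

function buildMessages(csvSample: string): ChatMessage[] {
  return [
    {
      role: "system",
      content:
        "You are an expert in data science and data visualization. " +
        "Answer questions about the dataset the user provides.",
    },
    { role: "user", content: `Here is a sample of my dataset:\n${csvSample}` },
  ];
}

// These messages would then be POSTed to OpenAI's completions endpoint.
const messages = buildMessages("country,sales\nAR,120\nUS,340");
console.log(messages[0].role); // → "system"
```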
Make GPT answer with JSON responses
Then the idea popped: I could give it a header saying this:
…then the AI responded
And that’s it; I only need to parse that data and render it in React.
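The actual response schema isn't shown above, but the parsing step can be sketched. The `ChartSpec` shape below is an assumption of mine, not the real schema; the point is to validate the model's text output before handing it to a React chart component:

```typescript
// Hypothetical shape of the JSON the model is asked to return.
interface ChartSpec {
  type: "bar" | "line" | "pie"; // chart type suggested by the model
  title: string;
  xKey: string; // column to plot on the x-axis
  yKey: string; // column to plot on the y-axis
}

// Parse the model's text output, guarding against malformed JSON —
// the model occasionally wraps or truncates its answer.
function parseChartSpec(raw: string): ChartSpec | null {
  try {
    const spec = JSON.parse(raw);
    if (spec && typeof spec.type === "string" && typeof spec.xKey === "string") {
      return spec as ChartSpec;
    }
    return null;
  } catch {
    return null;
  }
}

const spec = parseChartSpec(
  '{"type":"bar","title":"Sales by country","xKey":"country","yKey":"sales"}'
);
console.log(spec?.type); // → "bar"
// In React, `spec` would then drive a chart component chosen by `spec.type`.
```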
Going a little bit further
In data-visualization dashboards, we usually don’t have only charts. Other common elements are filters and KPIs (key performance indicators).
So maybe if I tweak the input a little bit:
And to my surprise, it worked:
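The tweaked prompt and its response aren't reproduced here, but the richer JSON can be pictured as a full dashboard spec. The field names below are my guess at what such a response might contain:

```typescript
// A guessed shape for the richer response: charts plus filters and KPIs.
interface DashboardSpec {
  kpis: { label: string; column: string; aggregation: "sum" | "avg" | "count" }[];
  filters: { column: string; type: "select" | "range" }[];
  charts: { type: string; xKey: string; yKey: string }[];
}

function parseDashboardSpec(raw: string): DashboardSpec {
  return JSON.parse(raw) as DashboardSpec;
}

const dashboard = parseDashboardSpec(JSON.stringify({
  kpis: [{ label: "Total sales", column: "sales", aggregation: "sum" }],
  filters: [{ column: "country", type: "select" }],
  charts: [{ type: "bar", xKey: "country", yKey: "sales" }],
}));
console.log(dashboard.kpis[0].label); // → "Total sales"
// Each array maps to a group of React components: KPI cards, filter controls, charts.
```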
Try it!
At Leniolabs, we have implemented a version of this idea; of course, we made it open-source, so you can also try it in your local environment.
To analyze your datasets, you will need to use your own OpenAI API key.
Conclusion
I’m amazed… and a little scared too, because everything we discussed here can be achieved with a query that costs as little as $0.02. If we split this into several queries and give more context about the data, the possibilities are endless.
The key point is that with this approach, you don’t need to send ALL the data to the endpoint: a sample of a few records is enough, since all the calculations will be handled later in the app.
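A minimal sketch of that sampling step, assuming the dataset arrives as a CSV string (the function name is mine):

```typescript
// Take only the header plus the first few rows of a CSV string —
// enough context for the model, without sending the whole dataset.
function sampleCsv(csv: string, rows = 5): string {
  const lines = csv.trim().split("\n");
  return lines.slice(0, rows + 1).join("\n"); // header + `rows` records
}

const csv = "country,sales\nAR,120\nUS,340\nBR,210\nDE,95\nFR,180\nJP,300";
console.log(sampleCsv(csv, 3));
// → country,sales
//   AR,120
//   US,340
//   BR,210
```

The full dataset stays in the browser, where the app computes the actual aggregations once the model has described which charts, filters, and KPIs to build.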