Cursor for Everything


Here at DoltHub, we've been fond of saying “Dolt is the database for agents” ever since we discovered Claude Code. But what exactly does that mean? Well… Agents need branches. Agents need tests. Agents need clones. Agents need diffs. Agents need version control.

But pragmatically, how would you use Dolt to make your application agentic? Cursor is the standard user interface for LLM-powered applications. Dolt allows you to build Cursor for Everything. Cursor for QuickBooks. Cursor for Justworks. Cursor for Stripe. Cursor for Google Analytics. Cursor for AWS. These are just examples of the applications I use to run DoltHub. This article will explain how.

Cursor for Everything

What is Cursor?#

As evidenced by their latest raise, Cursor is the defining Artificial Intelligence (AI) application of this cycle. Cursor is a Large Language Model (LLM)-assisted Integrated Development Environment (IDE) that helps software developers write code.

Cursor example

Cursor started with a familiar user interface (UI), Visual Studio Code, and added a chat window on the right side. The user interface was not just familiar. It was the exact Visual Studio Code user interface. Cursor was a fork of Visual Studio Code, which is open-source. I think this matters. Software developers didn’t have to learn a new user interface to adopt AI to help them write code. AI appeared right next to the tool they were already used to.

The chat window on the right side of the familiar IDE was connected to an LLM. A set of tooling was developed to allow the LLM to search, inspect, and edit the code files opened in the IDE. Through a chat-based interface, the user could ask the LLM to perform tasks on their behalf. For instance, you could ask the LLM to explain what was in the code and the LLM would craft a reasonable explanation.

Moreover, you could ask the LLM to make code changes on your behalf. Make this function. Connect it to this interface. These changes were cleanly displayed using the IDE's built-in version control display functionality. You could click on a file clearly labeled as edited and see what the LLM changed. If you didn't like what the LLM did, no big deal. You used familiar version control functionality to roll back the changes and start over.

“Agent mode” is the latest Cursor innovation. Whereas earlier versions of Cursor were prompt/response, “agent mode” loops on LLM output until a user-set goal is achieved. This allowed users to give more complex tasks to the Cursor LLM. Cursor would perform multiple steps on the user's behalf. Change this code. Compile. Run this test. Go back to the code to fix the test. Report to the user the task is complete.

Agentic coding flowchart
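If you've never peeked under the hood, the core of an agent loop is surprisingly small. Here's a minimal, generic sketch in Python (not Cursor's actual implementation), where `call_llm` and `run_tool` are hypothetical stand-ins for an LLM API call and a tool executor:

```python
# A minimal, generic agent loop. This is an illustration, not Cursor's code.
# `call_llm` and `run_tool` are hypothetical stand-ins for an LLM API call
# and a tool executor (edit a file, compile, run a test, ...).

def agent_loop(task, call_llm, run_tool, max_steps=20):
    messages = [{"role": "user", "content": task}]
    for _ in range(max_steps):
        reply = call_llm(messages)             # model decides: answer, or call a tool
        if reply.get("tool_call") is None:     # no tool needed, goal reached
            return reply["content"]            # report back to the user
        result = run_tool(reply["tool_call"])  # e.g. edit code, compile, run tests
        messages.append({"role": "assistant", "content": str(reply["tool_call"])})
        messages.append({"role": "user", "content": f"Tool result: {result}"})
    return "Stopped: hit the step limit before the goal was met."
```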

That’s where we are today. Cursor is glorious, having ushered in the vibe coding era.

Chat on the Side#

Cursor’s victory in the IDE category is so complete that the “chat on the side” user interface is becoming the default interface for adding LLMs to your application.

App Builders like Vercel v0, Replit, and Lovable all adopted this user interface. App Builders let users build a web-based application through a chat window. A starter prompt would be something like “I would like a website to manage the inventory for my business”, and the App Builder builds a functioning website from the user's prompts.

In the App Builder case, the UI is similar, but the sides are flipped. You have the chat window on the left and the web application you are building on the right.

Replit

You even start to see the “chat on the side” user interface in Google's apps, although they haven’t moved to a full chat window yet. As you can see below, Google suggests things you may want to do in that panel, and it keeps the chat on the right side.

Google Sheets Chat

I think this user interface is so intuitive and powerful that it will quickly become the standard for the next generation of applications.

Cursor for Everything#

Last week, Austen Allred of Gauntlet AI took to X to talk about his plan to turn around Marin Software, a performance marketing application Gauntlet AI acquired to test whether a team of AI experts could quickly fix the struggling company.

Cursor for Ads Tweet

I can picture the user interface as he describes the application: the normal performance marketing application on the left with an LLM-powered chat window on the right. I think this is a good idea, and I suspect engineers and product designers everywhere are having a similar epiphany these days.

Cursor for Everything

How?#

It’s becoming quite simple to add a chat window on the side of your application. I’m assuming your application has a standard service-oriented architecture.

SOA

1. Add a chat panel to your UI#

You can even do this using a vibe coding tool like Cursor. Add a panel on either the right or left side of your existing application. Fill the panel with a text box and allow the user to enter a prompt.

2. Connect the chat panel to an LLM#

We recently built a simple list-making application showing off how to do this. Foundation model companies like OpenAI and Anthropic provide libraries and APIs to make this easy for you.

You may decide to include not only the user's prompt in the payload you send to the LLM, but also additional context about the application so that the LLM generates a better response.

Once you do this, your LLM-powered chat panel will be able to answer questions about your application. Think of this as read access: how well the LLM can answer questions about your application will be dictated by the context you send it.
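To make this concrete, here's a minimal sketch of the backend half of that chat panel, assuming a small Flask service and the OpenAI Python SDK. The `/chat` route and the `get_app_context()` helper are hypothetical; substitute whatever framework and application context you already have.

```python
# Minimal backend for the chat panel: the UI POSTs the prompt here, we add
# application context, and forward everything to an LLM. Assumes Flask and
# the OpenAI Python SDK; the route and get_app_context() are hypothetical.
from flask import Flask, request, jsonify
from openai import OpenAI

app = Flask(__name__)
client = OpenAI()  # reads OPENAI_API_KEY from the environment


def get_app_context() -> str:
    # Hypothetical helper: summarize whatever application state the LLM
    # needs to answer well (current page, selected records, schema, etc.).
    return "The user is viewing the inventory page with 42 items."


@app.post("/chat")
def chat():
    prompt = request.json["prompt"]
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": f"App context: {get_app_context()}"},
            {"role": "user", "content": prompt},
        ],
    )
    return jsonify({"reply": response.choices[0].message.content})
```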

3. Expose your application’s API to the LLM as tools#

Now it is time to allow the LLM to make writes. LLM frameworks expose a concept called tools. Tools are operations the LLM can perform on the user's behalf.

The common standard for exposing tools to LLMs is the Model Context Protocol (MCP). So, you may need to incorporate an MCP API into your service. The good news is MCP is very similar to other web request frameworks like REST, GraphQL, or gRPC.
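As a sketch of what that looks like, here's a tiny MCP server built with the MCP Python SDK's `FastMCP` class. The tool names and their bodies are hypothetical stand-ins for your application's real API.

```python
# Sketch of exposing a slice of your application's API as MCP tools, using
# the MCP Python SDK's FastMCP server. The tool bodies are hypothetical; in
# a real service they would call into your existing business logic.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("inventory-app")


@mcp.tool()
def list_items(search: str = "") -> list[dict]:
    """Read-only tool: return inventory items matching a search string."""
    # Hypothetical: delegate to your existing service layer / REST handler.
    return [{"id": 1, "name": "widget", "quantity": 42}]


@mcp.tool()
def update_quantity(item_id: int, quantity: int) -> str:
    """Write tool: change an item's quantity. Writes are where it gets risky."""
    # Hypothetical: the kind of operation you only want the LLM making on a
    # branch, as described later in this post.
    return f"Item {item_id} set to quantity {quantity}"


if __name__ == "__main__":
    mcp.run()  # serves the tools over MCP (stdio by default)
```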

The big question here is, what operations do you expose to the LLM as tools? Read operations are likely safe to expose, minus data privacy concerns. Unless you are running a local LLM, you are sending potentially sensitive data to an LLM provider. But what about writes? This is where the “Cursor for X” vision gets tricky. The true power of Cursor is not having an LLM explain code to you. The true power is in having Cursor write code for you. But LLMs make mistakes. How do you protect your application from these mistakes the way Cursor does?

LLMs Make Mistakes#

As we all know at this point, LLMs make mistakes. In fact, it’s not that difficult to come up with an example. I kid you not, this mistake was generated on my first attempt with Google Gemini in Sheets. I first asked it to make a sheet with all US Presidents.

Google Sheets All Presidents

So far so good. I then asked Gemini to remove “Thomas Jefferson”.

Google Sheets Killed All Presidents

It mistakenly deleted all the newly created data. I hit Cmd-Z to undo, and Google Sheets would not even undo the change. It seems whatever interface Gemini is using is not hooked up to the user undo system. I was, however, able to recover the previous state by instructing the LLM to put it back.

There are a couple of issues at play in the Google Sheets example. First, it is not easy to see what Gemini changed: I just see a blank screen after it made its mistake. I should see a diff view until I accept the changes the LLM proposed. Second, it was not easy to undo the change. I should be able to reject the proposed changes and roll back to any previous state easily. Both of these problems are solved in Cursor because Cursor operates on version-controlled files. Calculating differences and rolling back are handled by your version control system, not Cursor itself. Cursor just provides the interface.

Cursor for Everything Needs Version Control#

So, why is Google Sheets missing diff and rollback? Because version control is a hard technical problem. There is no version-controlled spreadsheet on the market. Even for Google, it’s easier to add a chat window to Google Sheets than it is to add version control to Google Sheets.

But, there is a version-controlled SQL database. I bet the application you want to add Cursor to is backed by a database.

In the above example, if Google Sheets were backed by Dolt, Dolt could easily calculate the diff of what changed. Google Sheets could display something like the screenshot below to the user. This is the Dolt Workbench user interface.

Dolt Deleted Presidents
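Under the hood, that diff is just a query. As a sketch (the `presidents` table, branch names, and connection details are hypothetical), your application could ask Dolt's `dolt_diff()` table function for exactly what changed:

```python
# Sketch of the diff Dolt could compute in this scenario. The `presidents`
# table, its `name` column, and the branch names are hypothetical; dolt_diff()
# is a real Dolt table function returning row-level changes between revisions.
import pymysql

conn = pymysql.connect(host="127.0.0.1", port=3306, user="root", database="sheets")
with conn.cursor() as cur:
    cur.execute(
        "SELECT diff_type, from_name, to_name "
        "FROM dolt_diff('main', 'gemini-edits', 'presidents')"
    )
    for diff_type, from_name, to_name in cur.fetchall():
        # e.g. ('removed', 'George Washington', None) for each deleted row
        print(diff_type, from_name, to_name)
```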

So, let’s continue our how-to from above.

4. Migrate your backend database to Dolt#

If you are using a MySQL- or Postgres-compatible database, this step is easy. Dolt comes in both MySQL and Postgres flavors. Take a dump of your application’s database and import it into Dolt. Start a Dolt SQL Server. Change the connection string in your service to point to the new Dolt database. Your application should work exactly like it always has.
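As a rough sketch of how small this change is, assuming your service talks MySQL via a driver like `pymysql` (the host, credentials, and table below are placeholders):

```python
# After importing your dump into Dolt and starting a Dolt SQL server, the
# only application change is usually the connection string. Host, port,
# credentials, and table name below are placeholders for your own settings.
import pymysql

# Before: pymysql.connect(host="prod-mysql.internal", ...)
conn = pymysql.connect(
    host="127.0.0.1",    # wherever your Dolt SQL server is running
    port=3306,           # Dolt speaks the MySQL wire protocol
    user="root",
    password="",
    database="inventory",
)

with conn.cursor() as cur:
    cur.execute("SELECT COUNT(*) FROM items")  # existing queries work unchanged
    print(cur.fetchone())
```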

If you’re using another backend database, you have more work to do. If you are using SQL, hopefully your Object Relational Mapper (ORM) makes migrating databases easy. If you are a NoSQL user, you have a NoSQL-to-SQL migration on your hands. Modern SQL databases support JSON as a native type, so this is not as hard as it used to be. It’s still hard, but maybe it’s worth it to get LLM-powered writes with version control into your application.

5. Use Dolt primitives to expose version control functionality in your application#

Finally, it’s time to start modifying your application code to take advantage of Dolt’s version control functionality. All the Git operations you know and love are exposed as SQL procedures, functions, and system tables.

The first thing to do is decide where your application wants to make Dolt commits. These can be as granular as every SQL transaction, or much more user-driven, like after the user clicks a create or update button. A Dolt commit is a permanent marker in your version control history. You can only branch off of and roll back to Dolt commits.

Then, you want to make the APIs the LLM is calling branch-aware. The LLM should only make writes on a branch. This will give you the ability to diff that branch against the main branch of the database and display what the agent changed to the user. Diffs are queryable like other tables so you have the full power of SQL to craft a bespoke diff UI in your application.

If the user likes the changes the LLM made, your UI should expose merge functionality to accept the changes. Dolt has sophisticated conflict detection so you can be sure an LLM won’t stomp a real user’s changes. If the user doesn’t like the LLM’s changes, just delete the branch and show the main branch in your application.
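Putting steps 4 and 5 together, here's a sketch of the branch-per-agent workflow using Dolt's SQL procedures. The procedures (`DOLT_CHECKOUT`, `DOLT_COMMIT`, `DOLT_MERGE`, `DOLT_BRANCH`) and the `dolt_diff()` table function are real Dolt features; the branch name, table, and connection details are hypothetical.

```python
# Sketch of the branch-per-agent workflow against a Dolt SQL server.
# Dolt's procedures and dolt_diff() are real; the branch name, table,
# and connection details are hypothetical placeholders.
import pymysql

conn = pymysql.connect(host="127.0.0.1", port=3306, user="root", database="inventory")
conn.autocommit(True)
cur = conn.cursor()

# 1. Give the LLM its own branch to write on.
cur.execute("CALL DOLT_CHECKOUT('-b', 'agent-session-123')")

# 2. The LLM's tool calls write as usual, then we make a Dolt commit:
#    a permanent marker we can branch from or roll back to.
cur.execute("UPDATE items SET quantity = 0 WHERE id = 1")
cur.execute("CALL DOLT_COMMIT('-a', '-m', 'Agent: zero out item 1')")

# 3. Show the user what the agent changed: diff the branch against main.
cur.execute("SELECT * FROM dolt_diff('main', 'agent-session-123', 'items')")
print(cur.fetchall())

# 4a. User accepts: merge the agent branch into main.
cur.execute("CALL DOLT_CHECKOUT('main')")
cur.execute("CALL DOLT_MERGE('agent-session-123')")

# 4b. Or the user rejects: just delete the branch and main is untouched.
# cur.execute("CALL DOLT_BRANCH('-D', 'agent-session-123')")
```

The important design choice here is that the LLM never writes directly to main: the user's accept button is a merge, and the reject button is just a branch delete.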

Dolt exposes the full power of Git, so your application can have very complicated Git workflows built in if you’d like. Your application can integrate human-in-the-loop review via a pull request-style workflow. Your application can expose the full commit log and branch model to your users. It really depends on how complicated you want to make version control in your application.

If you need inspiration for how your UI could look, try the Dolt Workbench. All functionality exposed in the Dolt Workbench is available via SQL. You can build it into your own application.

Conclusion#

To build Cursor-like functionality in your application, you must:

  1. Add a chat panel to your UI
  2. Connect the chat panel to an LLM
  3. Expose your application’s API to the LLM as tools
  4. Migrate your backend database to Dolt
  5. Use Dolt primitives to expose version control functionality in your application

We’re so committed to the idea of Cursor for Everything that we’re adding an LLM-powered chat window to the Dolt Workbench. Stay tuned for a sample application based on GraphQL and React. Ready to build Cursor-like functionality into your application? Come by our Discord and tell us about it. We’d love to help.
