Catalog of SQL Notebooks
As data proliferates in modern organizations, and as more people work with data in their day-to-day operations, it has become crucial to facilitate access to data assets. SQL is the programming language used by the most popular database servers worldwide, and SQL editors are the interfaces that provide access to the data. Democratizing access to data is therefore synonymous with making SQL accessible to a wide range of people.
This explains the explosion of SQL editors over the past few years. The trend shows no sign of stopping, and we would rather bring visibility and structure to it sooner rather than later.
At CastorDoc, we believe the first step to structuring the SQL editor market is more transparency. For that reason, we put together a list of all the SQL IDEs we have heard of. You can find more context on this analysis and the state of the SQL client ecosystem here.
This list is still exploratory and may contain errors or missing information. Please reach out to us at louise@castordoc.com if you notice anything wrong.
In-depth analysis and evolution
Read the full breakdown of SQL editors by generation, along with the market analysis, here.
What does Reddit think about SQL editors?
You can find the thread here.
Dive deeper into ETL and ELT tools
We compared the tools along the following criteria:
Processing: Does the solution offer batch processing, stream processing, or both?
Number of data sources: From how many applications can the ETL tool extract data?
Observability: Does the tool let you see how your syncs are performing overall? Can you easily identify when a sync fails and why? Can you get an alert when a sync fails?
Transformation: What kind of transformation does the solution offer? SQL, dbt, or both?
Custom connector: Can the solution build and maintain a custom data connector upon customer request?
Modeling: How is the data warehouse queried using the solution? Is it pure SQL, or does the solution offer no-code features such as drag-and-drop?
Incremental synchronization: When synchronizing the sources with the destination, does the solution offer the possibility to synchronize only the data that has been modified since the last export, or does it resend all the data in the segment each time? (A minimal SQL sketch follows this list.)
Community: Is there an online community around the tool? How helpful and knowledgeable are the responses found in this community?
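To make the incremental synchronization criterion concrete, here is a minimal SQL sketch of the high-water-mark pattern. All names are hypothetical (an orders source table with an updated_at column, and a _sync_state bookkeeping table); real tools track this state internally, often with cursors or log positions, but the idea is the same.

```sql
-- Hypothetical example: pull only the rows modified since the last successful sync.
-- Assumes the source table has a reliable updated_at column and that the pipeline
-- records a high-water mark per table in a _sync_state table.
SELECT *
FROM orders
WHERE updated_at > (
    SELECT last_synced_at
    FROM _sync_state
    WHERE table_name = 'orders'
);

-- After a successful load, advance the high-water mark.
UPDATE _sync_state
SET last_synced_at = CURRENT_TIMESTAMP
WHERE table_name = 'orders';
```

Without this, every run re-extracts the full segment, which is the cost you pay in volume and sync time as tables grow.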
Features:
Capture deletes: Data deleted in the source can still be accessed in the destination, where it is marked as "deleted" (the trigger sketch after this list shows one way to do this).
Custom data: The solution replicates your custom data. Custom data includes custom objects, tables, and fields that you have configured in the source system to better suit your business needs.
Re-sync: Re-sync all your data from scratch.
Data blocking: Prevent some tables or columns from replicating to your destination, so that only relevant data is synced and PII remains protected (see the column-filter sketch after this list).
History: The solution offers the possibility to see how your data has changed over time.
Import API: Data sent to the Import API is processed and sent to your destination like data from any other integration, making it possible to pull in data from any REST API.
Automatic scaling: High-availability infrastructure that can process billions of records every day.
Change data capture: Change data capture (CDC) is a process that captures changes made in a database and ensures that those changes are replicated to a destination such as a data warehouse (illustrated in the trigger sketch after this list).
Failed syncs: Do you get a notification when a sync fails?
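To illustrate how change data capture and delete capture fit together, here is a simplified, trigger-based sketch in PostgreSQL-flavored SQL. This is a teaching example under stated assumptions: production CDC tools usually read the database's transaction log rather than installing triggers, and the orders and orders_changes tables are hypothetical.

```sql
-- Hypothetical change log: one row per insert, update, or delete on orders.
CREATE TABLE orders_changes (
    order_id   BIGINT,
    operation  TEXT,                                 -- 'INSERT', 'UPDATE', or 'DELETE'
    changed_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    payload    JSONB                                 -- row contents at the time of the change
);

-- Record every change, including deletes, so deleted rows stay visible
-- downstream instead of silently disappearing (the "capture deletes" feature).
CREATE FUNCTION log_order_change() RETURNS trigger AS $$
BEGIN
    IF TG_OP = 'DELETE' THEN
        INSERT INTO orders_changes (order_id, operation, payload)
        VALUES (OLD.id, 'DELETE', to_jsonb(OLD));
        RETURN OLD;
    ELSE
        INSERT INTO orders_changes (order_id, operation, payload)
        VALUES (NEW.id, TG_OP, to_jsonb(NEW));
        RETURN NEW;
    END IF;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER orders_cdc
AFTER INSERT OR UPDATE OR DELETE ON orders
FOR EACH ROW EXECUTE FUNCTION log_order_change();
```

A destination that replays orders_changes in order ends up with the full history of the table, including rows that no longer exist in the source.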
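Data blocking, by contrast, is mostly a matter of narrowing what the sync is allowed to read. A minimal column-filter sketch, assuming a hypothetical customers table where email and ssn are the PII columns to withhold:

```sql
-- Replicate only non-sensitive columns; email and ssn never leave the source.
SELECT id, country, signup_date, plan
FROM customers;
```

In practice this is configured in the tool's sync settings rather than hand-written, but the effect on the destination is the same.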