Tables Limits & Importing CSVs 📝

Hi there, our lovely community :star_struck:

Tables now have limits, and they are as follows:

  1. Records per Table: 1,500 records.
  2. Fields per Table: 15 fields.
  3. Tables per Project: 20 tables.

These limits apply to our cloud users. For self-hosters they are just the defaults, and they can be changed through their respective environment variables.
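
For self-hosters, here is a minimal sketch of how env-driven limits like these are typically wired up. The variable names are hypothetical (check the self-hosting configuration reference for the real ones); only the default values come from the announcement above:

    // Hypothetical sketch: the env var names below are illustrative, not the
    // actual ActivePieces ones. Defaults match the announced cloud limits.
    const tableLimits = {
      maxRecordsPerTable: Number(process.env.AP_MAX_RECORDS_PER_TABLE ?? 1500),
      maxFieldsPerTable: Number(process.env.AP_MAX_FIELDS_PER_TABLE ?? 15),
      maxTablesPerProject: Number(process.env.AP_MAX_TABLES_PER_PROJECT ?? 20),
    };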

You can now also import CSVs into your table:

Let us know what you think in the comments below :point_down:

Thanks for the update! :raised_hands:
Just a quick thought: 1,500 records per table feels quite limited compared to something like Google Sheets or AiTable, which allow up to 50k records per sheet.

Is there a particular reason behind this cap?
Would it be possible to increase the limit, especially for larger workflows?

Also, would it be possible to completely delete all records from a table?

Appreciate your work as always :pray:

20-table limit per account (if not on an enterprise plan) :skull:

Was hoping this could be used like a Make datastore. Tables per account/project should be unlimited imo, with any limits being a cumulative total across all tables (+ any individual table limits, if needed, due to technical limitations).

50 tables with 5 rows (250 records in total) < 20 tables with 1.5k rows (30k records in total).

1.5k rows per table is low, but at least usable for storing variables/memory and params. 20 tables per account (non-enterprise) basically means this feature is dead to me, unfortunately.

Hi there @Rida.
We are trying to optimize retrieving, inserting, and updating records; once that’s done, we will take another look at the limits and increase them.

You can currently delete all records in a table by ticking the checkbox in the top-left corner of your table and then hitting delete. We will take this feedback and improve the user experience; we really appreciate you letting us know :pray:

Hi there @log, thank you for your feedback on the tables-per-project limit.

As I mentioned in my previous comment, these limits will be improved once we do some optimizations. Would you please elaborate further on your use case so we can better understand the issue with the tables-per-project limit?

Possible Bug – CSV Import Doesn’t Respect Quoted Fields Containing Commas

Description:

When importing a CSV file into an ActivePieces Table, fields that contain commas are incorrectly split across multiple columns, even when those fields are properly enclosed in double quotation marks as per standard CSV formatting. (And if you manually add quotes, they double up on import, so I assume AP already adds these by default?)

Steps to reproduce:

  1. Create a CSV file with values such as:
    URLs,Titles,Descriptions,Name
    https://example.com,Test Page,"This is a test, with commas",Test
    
  2. Import it into a Table via the CSV Import feature.
  3. Map the columns accordingly.
  4. Observe that the “Descriptions” field is split at the comma, and the portion following the comma is incorrectly placed into the next column — despite being within quotation marks.

Expected behaviour:
The CSV parser should treat any field enclosed in double quotation marks (") as a single value, even if it contains commas — which is the expected behaviour based on CSV standards.

Actual behaviour:
The parser appears to split fields on every comma, including those within quoted fields, resulting in broken imports and misaligned data.
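
To make the symptom concrete, here is a minimal sketch (in TypeScript) contrasting a naive comma split, which is presumably what the importer does internally, with the output a quote-aware parser should produce for the sample row above:

    const row = 'https://example.com,Test Page,"This is a test, with commas",Test';

    // Naive split on every comma: what the importer appears to be doing.
    console.log(row.split(','));
    // -> [ 'https://example.com', 'Test Page', '"This is a test',
    //      ' with commas"', 'Test' ]  (5 columns, quotes leak into the data)

    // A quote-aware (RFC 4180) parse should instead yield 4 columns:
    // -> [ 'https://example.com', 'Test Page',
    //      'This is a test, with commas', 'Test' ]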


Suggested resolution:
Please update the CSV parser used for Tables to correctly support quoted-field logic, in line with RFC 4180 (a sketch of the logic follows the list below). Specifically:

  • Fields enclosed in double quotes should be treated as one unit.
  • Escaped quotes ("") within fields should be parsed correctly.
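
For reference, here is a minimal sketch (in TypeScript) of the quoted-field logic described above. It is a single-pass state machine, not a claim about how the Tables importer is actually implemented, and a production parser should also handle newlines inside quoted fields and CRLF line endings:

    // Minimal RFC 4180-style line parser: commas inside double quotes do not
    // split fields, and an escaped quote ("") becomes a literal quote.
    function parseCsvLine(line: string): string[] {
      const fields: string[] = [];
      let current = '';
      let inQuotes = false;

      for (let i = 0; i < line.length; i++) {
        const ch = line[i];
        if (inQuotes) {
          if (ch === '"' && line[i + 1] === '"') {
            current += '"'; // escaped quote inside a quoted field
            i++;
          } else if (ch === '"') {
            inQuotes = false; // closing quote ends the quoted region
          } else {
            current += ch; // commas here are kept, not treated as separators
          }
        } else if (ch === '"') {
          inQuotes = true; // opening quote
        } else if (ch === ',') {
          fields.push(current); // field boundary only outside quotes
          current = '';
        } else {
          current += ch;
        }
      }
      fields.push(current);
      return fields;
    }

    // The sample row from the repro steps parses into four columns:
    parseCsvLine('https://example.com,Test Page,"This is a test, with commas",Test');
    // -> ['https://example.com', 'Test Page', 'This is a test, with commas', 'Test']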

Impact:
This makes it very difficult to import free-form text such as descriptions, product details, or any other content that naturally contains commas. It’s currently a major limitation when working with real-world CSV data.

Kind regards,