Scenario:
I'm working on a project that needs to accept large amounts of customer data from its users. It can be normal for a user to add 10,000 or 100,000 records at a time. On some other forms it's normal to add 2 or 3 records at a time; on others, 50 to 100.
Problem:
How can we prevent robots (or humans) from using our website forms to submit massive amounts of wrong or useless data and fill up my database resources?
Possible solutions that I have ruled out:
Limiting the amount of data is not an option, for the reasons mentioned above. Adding a CAPTCHA to each form would be very labor-intensive, as there may be 10,000 forms.
The question:
So what options do I have to prevent robots/automated tools from accessing the system?
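For context, one direction I've considered is server-side rate limiting rather than per-form CAPTCHAs. This is only a sketch of the idea, not something I've deployed; the thresholds (10 submissions per minute per client IP) and the in-memory token-bucket approach are hypothetical, and keying on IP alone has obvious weaknesses against distributed bots:

```python
import time
from collections import defaultdict

# Hypothetical thresholds: allow at most 10 form submissions
# per minute from a single client IP.
MAX_TOKENS = 10
REFILL_RATE = MAX_TOKENS / 60.0  # tokens regained per second

# One token bucket per client IP (in-memory; a real deployment
# would likely use a shared store such as Redis).
_buckets = defaultdict(lambda: {"tokens": MAX_TOKENS, "last": time.monotonic()})

def allow_submission(client_ip: str) -> bool:
    """Token-bucket check: True if this IP may submit a form right now."""
    bucket = _buckets[client_ip]
    now = time.monotonic()
    # Refill tokens for the time elapsed since the last request,
    # capped at the bucket capacity.
    bucket["tokens"] = min(
        MAX_TOKENS, bucket["tokens"] + (now - bucket["last"]) * REFILL_RATE
    )
    bucket["last"] = now
    if bucket["tokens"] >= 1:
        bucket["tokens"] -= 1
        return True  # submission accepted
    return False     # over the rate limit; reject or challenge
```

A burst of rapid submissions from one IP drains its bucket and gets rejected, while a second IP is unaffected. I'm aware this doesn't distinguish a legitimate 100,000-record bulk upload from abuse, which is partly why I'm asking what other options exist.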