I am working on migrating tons of old data to DeltaDNA and want to clarify the best way to do such migrations.
We have more than 50 tables; one of them, for example, contains ~13 million records. Is the correct approach simply to perform a series of bulk requests (roughly ~2,600 bulk requests for this table), and the same for the other tables?
Also, when I used your Interactive Event Validator, I noticed it shows the error "Event timestamp outside valid boundaries - eventTimestamp" when I use quite old dates in the eventTimestamp field.
Is this configurable, and can it be adjusted to accept old data?
I believe my colleague has been in touch with you directly, but to summarize: eventlists sent in bulk should never exceed 5 MB, and any events recorded more than 32 days ago are marked as invalid. We can set a custom value for this if you like so you can ingest this legacy data; however, this is a manual change, so please contact us directly via firstname.lastname@example.org