"Storing data throws batch update error"
mario_playing_w
New Altair Community Member
Hello,
Currently I am building a few processes on RapidAnalytics. When storing larger data sets I encounter a strange problem: as soon as I try to store more than 400k cases (5 attributes), the process fails on RapidAnalytics and throws
"javax.ejb.EJBException: javax.persistence.PersistenceException: org.hibernate.exception.GenericJDBCException: Could not execute JDBC batch update."
Is my data simply too large, or is my RAM insufficient? I have already worked on larger datasets, so this error while storing data seems strange to me.
Does anybody know this error and maybe a workaround for it?
Thanks!
Mario
Answers
Hi, please provide the following details so that we can understand the issue better.
1. What database are you using, and which version?
2. How are you saving the data? Are you using a Write Database operator, or are you storing it in the repository? Providing a sample process would also help us find the cause of the issue.
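One more thing that can help diagnose this: the message "Could not execute JDBC batch update" is only Hibernate's wrapper, and the real database error (for example a column size limit, a constraint violation, or a full tablespace) is usually chained behind it via the standard JDBC exception chain. A minimal sketch of how such a chain can be unwrapped with plain JDBC classes (the exception messages here are made up for illustration):

```java
import java.sql.BatchUpdateException;
import java.sql.SQLException;

public class UnwrapBatchError {

    // Walk the JDBC exception chain (getNextException) to reach the
    // last, most specific error reported by the database driver.
    static String rootMessage(SQLException e) {
        SQLException cur = e;
        while (cur.getNextException() != null) {
            cur = cur.getNextException();
        }
        return cur.getMessage();
    }

    public static void main(String[] args) {
        // Simulate the kind of chained exception Hibernate wraps:
        // the generic batch failure with the real cause attached.
        BatchUpdateException batch = new BatchUpdateException(
                "Could not execute JDBC batch update", new int[0]);
        batch.setNextException(
                new SQLException("ERROR: value too long for column"));

        System.out.println(rootMessage(batch));
    }
}
```

If you can get at the server log, searching for the chained `SQLException` messages near the timestamp of the failure will usually tell you whether the database itself is rejecting the rows.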
0