OutOfMemory « Batch « JPA Q&A

1. OutOfMemory when reading big amounts of data using hibernate    stackoverflow.com

I need to export a large amount of data from the database. Here are the classes that represent my data:

public class Product {
    ...

    @OneToMany
    @JoinColumn(name = "product_id")
    ...
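The usual way to keep an export like this from exhausting the heap is to stream the rows with a read-only, forward-only cursor and clear the session at intervals, instead of letting every Product accumulate in the persistence context. The following is only a sketch against the classic Hibernate Session API (3.x/4.x); the sessionFactory field and the writeToExport(...) callback are placeholders, not code from the question.

import org.hibernate.ScrollMode;
import org.hibernate.ScrollableResults;
import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.Transaction;

public class ProductExporter {

    private final SessionFactory sessionFactory;   // assumed to be configured elsewhere

    public ProductExporter(SessionFactory sessionFactory) {
        this.sessionFactory = sessionFactory;
    }

    // Streams products with a forward-only cursor and clears the first-level
    // cache every 100 rows so already-exported entities can be garbage collected.
    public void exportAll() {
        Session session = sessionFactory.openSession();
        Transaction tx = session.beginTransaction();
        try {
            ScrollableResults results = session.createQuery("from Product")
                    .setReadOnly(true)
                    .setFetchSize(100)                    // hint for the JDBC driver
                    .scroll(ScrollMode.FORWARD_ONLY);
            int count = 0;
            while (results.next()) {
                Product product = (Product) results.get(0);
                writeToExport(product);                   // hypothetical output step
                if (++count % 100 == 0) {
                    session.clear();                      // detach everything read so far
                }
            }
            results.close();
            tx.commit();
        } finally {
            session.close();
        }
    }

    private void writeToExport(Product product) {
        // placeholder: write the row to a file, CSV stream, etc.
    }
}

A StatelessSession is another option for this kind of job, since it keeps no first-level cache at all, at the cost of lazy loading and cascades.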

2. Still getting outofmemory errors with batch processing    forum.hibernate.org

int rows = 0;
try {
    Session as400Session = getHibernateTemplate().getSessionFactory().openSession();
    //set up the ora session
    HibernateDaoSupport dao = (HibernateDaoSupport) getOraDao();
    oraSession = dao.getSessionFactory().openSession();
    oraTx = oraSession.beginTransaction();
    //delete the vins
    int rowsDeleted = getOraDao().deleteVins(oraSession);
    log.info("There were " + rowsDeleted + " vins deleted");
    //set up the as400 transaction
    as400Tx = as400Session.beginTransaction();
    rows = ( (Integer) as400Session.iterate("select count(*) from As400Vin").next() ).intValue();
    Query q = ...
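In reports like this the memory is usually held by the first-level cache of one or both sessions: every As400Vin that is read and every row written to Oracle stays attached until commit. The standard pattern from the Hibernate batch-processing documentation is to flush and clear in fixed-size chunks. The sketch below is illustrative only; the As400Vin/OraVin types, the copy step, and the two session factories are assumptions based on the snippet above.

import org.hibernate.ScrollMode;
import org.hibernate.ScrollableResults;
import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.Transaction;

public class VinCopyJob {

    private static final int BATCH_SIZE = 50;   // keep in line with hibernate.jdbc.batch_size

    // Copies VIN rows from the AS/400 database to Oracle in bounded chunks,
    // flushing and clearing both sessions so neither cache grows without limit.
    public void copyVins(SessionFactory as400Factory, SessionFactory oraFactory) {
        Session as400Session = as400Factory.openSession();
        Session oraSession = oraFactory.openSession();
        Transaction oraTx = oraSession.beginTransaction();
        try {
            ScrollableResults vins = as400Session.createQuery("from As400Vin")
                    .setReadOnly(true)
                    .scroll(ScrollMode.FORWARD_ONLY);
            int count = 0;
            while (vins.next()) {
                As400Vin source = (As400Vin) vins.get(0);
                oraSession.save(toOraVin(source));        // hypothetical mapping step
                if (++count % BATCH_SIZE == 0) {
                    oraSession.flush();                   // push pending inserts
                    oraSession.clear();                   // then drop them from the cache
                    as400Session.clear();                 // and the rows already read
                }
            }
            vins.close();
            oraTx.commit();
        } finally {
            oraSession.close();
            as400Session.close();
        }
    }

    private OraVin toOraVin(As400Vin source) {
        // placeholder: build the Oracle-side entity from the AS/400 row
        return new OraVin();
    }
}

Setting hibernate.jdbc.batch_size to the same chunk size and fetching the count with a plain "select count(*)" via uniqueResult() rather than iterate() are also commonly recommended for jobs like this.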

3. Batch Blob Insert OutOfMemory    forum.hibernate.org