1. How to handle large dataset with JPA (or at least with Hibernate)? (stackoverflow.com)
I need to make my web-app work with really huge datasets. At the moment I get either an OutOfMemoryException or output that takes 1-2 minutes to generate. Let's put it simply and suppose ...
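The usual remedy for an OutOfMemoryException on a huge result set is to page through it rather than load it all at once, e.g. with `Query.setFirstResult`/`setMaxResults` in JPA/Hibernate. A minimal, self-contained sketch of the pattern; the in-memory `fetchPage` is a stand-in for the real database query, and all names here are illustrative assumptions:

```java
import java.util.ArrayList;
import java.util.List;

public class PagedFetch {
    // Stand-in for a JPA/Hibernate query; in real code this would be
    // query.setFirstResult(offset).setMaxResults(pageSize).getResultList().
    static List<Integer> fetchPage(List<Integer> table, int offset, int pageSize) {
        int end = Math.min(offset + pageSize, table.size());
        if (offset >= end) return new ArrayList<>();
        return new ArrayList<>(table.subList(offset, end));
    }

    // Walks the whole data set one page at a time, so at most
    // pageSize rows are ever held in memory.
    static long processAll(List<Integer> table, int pageSize) {
        long sum = 0;
        int offset = 0;
        while (true) {
            List<Integer> page = fetchPage(table, offset, pageSize);
            if (page.isEmpty()) break;
            for (int row : page) sum += row;
            offset += pageSize;
        }
        return sum;
    }

    public static void main(String[] args) {
        List<Integer> table = new ArrayList<>();
        for (int i = 1; i <= 1000; i++) table.add(i);
        System.out.println(processAll(table, 100)); // 500500
    }
}
```

The same shape works with Hibernate's `ScrollableResults` when the driver supports server-side cursors.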
2. Cascade delete performance drop on bigger datasets, can this be caused by lack of indexing? (stackoverflow.com)
I'm writing some code that has to cascade-delete records in a certain database, and I noticed a drop in performance as the database accumulates more records. When I ...
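Missing indexes on foreign-key columns are indeed a classic cause: to cascade-delete a parent, the database must find every child row referencing it, which without an index on the FK column is a full table scan per parent. A small runnable illustration of the difference (the `Child` type and field names are assumptions, not from the question; the map plays the role a `CREATE INDEX` on the FK column plays in the database):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class FkIndexDemo {
    record Child(int id, int parentId) {}

    // No index: scan every child row for each parent being deleted.
    static List<Child> childrenOfScan(List<Child> children, int parentId) {
        List<Child> out = new ArrayList<>();
        for (Child c : children) {
            if (c.parentId() == parentId) out.add(c);
        }
        return out;
    }

    // Indexed: one pass to build the lookup, then each parent's
    // children are found directly, analogous to an index on parent_id.
    static Map<Integer, List<Child>> buildIndex(List<Child> children) {
        Map<Integer, List<Child>> idx = new HashMap<>();
        for (Child c : children) {
            idx.computeIfAbsent(c.parentId(), k -> new ArrayList<>()).add(c);
        }
        return idx;
    }

    public static void main(String[] args) {
        List<Child> children = List.of(
                new Child(1, 10), new Child(2, 10), new Child(3, 20));
        System.out.println(childrenOfScan(children, 10).size()); // 2
        System.out.println(buildIndex(children).get(20).size()); // 1
    }
}
```

In the database itself the fix is simply an index on the child table's foreign-key column; many engines do not create one automatically for FK constraints.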
3. Large SQL dataset query using java (stackoverflow.com)
I have the following configuration:
4. Performance degradation second level cache for large dataset (forum.hibernate.org)
Since your strategy is read-only, I'll assume that the data is immutable within the application? Some things to look for:
1.) Even though you've enabled the query cache, each HQL query must set cacheable = true:
Code:
Query query = session.createQuery("from CRONumberDetailsResult as phone_details where phone_details.slotId = :slotId");
query.setCacheable(true);
If you don't set cacheable = true, you'll always be hitting the ...
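Note that `query.setCacheable(true)` only helps if the query cache is enabled globally as well; both switches are needed. A minimal configuration sketch using the standard Hibernate property names (the cache provider itself must also be configured, which is omitted here):

```
# hibernate.properties: enable the second-level cache and the query cache.
# A cache provider / region factory must additionally be configured.
hibernate.cache.use_second_level_cache=true
hibernate.cache.use_query_cache=true
```

With the query cache off globally, per-query `setCacheable(true)` calls are silently ignored, which matches the "always hitting the database" symptom described in the answer.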