Question

I have searched for an answer, but I didn't find anything that answers my question 100%. :D
So, I need to create 3 new hashmaps from one hashmap, exchanging 2 of its values in 3 different ways (I use map2.putAll(map1) and then swap 2 values). Then I have to store the hashmaps in an ordered PriorityQueue, and later poll one of them and repeat the process.

The problem is that I have to do this A LOT of times, as long as a special condition isn't satisfied... and there's no way to reduce the number of maps I create (well, I could block duplicate ones, but I think that would be very expensive :/ ).

Actually, I don't really mind whether I store all the maps somewhere (from what I read, I figured out it's the GC's fault) or whether I have to use other tricks... I just want to get my code working, without (if possible) using flags like '-XX:-UseGCOverheadLimit', because it's a sort of homework (even if ungraded), and it must work with 'java classname' from the prompt.

The only limit is on time (which is not a problem, 100% sure about it, already tested), and it will be tested with just simple inputs (on which it already works fine), but I'd like to make it able to handle even difficult ones.

Thanks in advance.

Oh, the maps are of kind <Integer,Integer> and have a size of 16. Unfortunately, I can't post the code here since the homework hasn't ended yet (even though I've submitted mine).


Solution

It's hard to judge from the description what happens in your code exactly, but if the memory overhead of allocating and discarding maps is what's killing you, you can cache Map instances and reuse them. Use some object-pooling implementation (there are plenty around), or just use a stack as an object pool if there's only one thread.
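As a rough sketch of the stack-as-pool idea (the class and method names here are illustrative, not from the original post), a single-threaded pool can be as simple as:

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.HashMap;
import java.util.Map;

// Minimal single-threaded object pool for HashMap instances.
// Reusing maps avoids allocating and garbage-collecting one per iteration.
class MapPool {
    private final Deque<Map<Integer, Integer>> pool = new ArrayDeque<>();

    // Hand out a recycled map if one is available, otherwise allocate.
    Map<Integer, Integer> borrow() {
        Map<Integer, Integer> m = pool.poll();
        return (m != null) ? m : new HashMap<>();
    }

    // Wipe the map's entries and put it back for reuse.
    void release(Map<Integer, Integer> m) {
        m.clear();
        pool.push(m);
    }
}
```

The important detail is the `clear()` on release: the same instance comes back empty, so the caller can `putAll` into it as if it were fresh.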

The other issue is that using Integer will cause massive memory churn, and since Integer is immutable, that's a bit of a pain. Replace your Map with the Trove int/int map and you're sorted :)

OTHER TIPS

HashMaps are expensive in terms of memory: only use them when you require quick lookups in a large dataset. Since yours "have size of 16", it doesn't look like you need HashMaps at all.
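To illustrate: if the 16 keys happen to be the integers 0..15 (an assumption, since the post doesn't say what the keys are), a plain int array replaces the map entirely, with no boxing and trivially cheap copies:

```java
// Assumption: keys are 0..15, so board[i] plays the role of map.get(i).
int[] board = new int[16];
for (int i = 0; i < 16; i++) {
    board[i] = i;            // equivalent of map.put(i, i)
}

// "Exchanging 2 of its values" becomes a simple swap:
int tmp = board[3];
board[3] = board[7];
board[7] = tmp;

// The map2.putAll(map1) copy step becomes clone(), which is very cheap
// for a 16-element primitive array:
int[] copy = board.clone();
```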

It might help if you could give us some more specifics about the task that was given to you... But let me share my thoughts on this.

First of all, there shouldn't be too much wrong with using Integer (the class) versus int (the primitive). It does use more memory, but it's still only integers; you should not run into any issues unless you have a huge truckload of them. The same goes for HashMap. Trust the community: if there were serious issues with HashMap, we would have found out by now.

It sounds to me like you have a memory leak. Check your program for that. You say you don't want to raise the memory limit (a wise decision if you ask me), but you might try it just as a test: if you do have a memory leak, your program will simply run slightly longer before hitting the exact same problem.

There are specialized tools for detecting memory leaks, but even the standard Windows process manager, used to watch your program's memory consumption over time, can quickly reveal a leak. If your program is doing the same thing over and over, you should expect memory consumption to be relatively stable. It will go up and down as memory is allocated and then collected by the garbage collector, but it should stay within a fairly stable band.

If you are convinced you do not have a memory leak, you are probably solving the problem in the wrong way. For example, a naive implementation that processes a file might read the whole file into memory, process it, and then save it back... but if the file is 1 GB, that approach consumes 1 GB of memory just to store the loaded data. If the program needs to handle large amounts of data, don't process it in one go; work in chunks. Read part of the file into a buffer, process it, discard the processed data, and read the next chunk. That way your application can handle files of any size.

It's just an example, but this kind of thing is typical of memory issues: either you have a leak, or your approach is wrong. Only programs that are HUGE and work with lots and lots of data should ever reach the memory limit. Most programs are not like that; they either have memory leaks or approach the problem the wrong way.
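The chunked-reading pattern above can be sketched like this (counting bytes stands in for whatever real processing is needed; the class and method names are illustrative):

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;

// Process a file of any size with a fixed memory footprint:
// the 8 KB buffer is reused for every chunk, so memory use does not
// grow with file size.
class ChunkedReader {
    static long countBytes(Path file) throws IOException {
        long total = 0;
        byte[] buffer = new byte[8192];
        try (InputStream in = Files.newInputStream(file)) {
            int read;
            while ((read = in.read(buffer)) != -1) {
                total += read;   // "process" this chunk, then reuse the buffer
            }
        }
        return total;
    }
}
```

Whatever the real processing is, the shape stays the same: process only what is in the buffer, then let the next read overwrite it.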

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow