Question

I have a Qt GUI that spawns a C++11 Clang-built server on OS X 10.8 (Xcode).

It does cryptographic proof-of-work mining of a name (single mining thread).

When I launch the .app by clicking it, the process takes 4 1/2 hours.

When I run the exact same executable inside the .app folder from the terminal, the process takes 30 minutes.

Question: how do I debug this?

Thank you.

====================================

Even worse:

The mining server is running in a terminal.

If I start the GUI program, which connects to the server and just sends it the "mine" command (over IPC): 4 hours.

If I start a command-line UI, which connects to the server and just sends it the same "mine" command (over IPC): 30 minutes.

In both cases the server is mining in a tight loop. Corrupt memory? A single CPU is at 100%, as it should be. I can't figure it out.
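
One way to narrow this down is to measure the raw hash rate in both runs. If the GUI-triggered run and the terminal-triggered run report the same hashes per second, the loop itself is equally fast in both cases and the 8x gap comes from the search simply running longer, not slower. A minimal sketch (the RateMeter helper is hypothetical, not part of the original server):

#include <chrono>
#include <cstdint>
#include <iostream>

// Call tick() once per hash attempt; prints the hash rate every 5 seconds.
struct RateMeter {
    uint64_t count = 0;
    std::chrono::steady_clock::time_point last = std::chrono::steady_clock::now();
    void tick() {
        ++count;
        auto now = std::chrono::steady_clock::now();
        if ( now - last >= std::chrono::seconds(5) ) {
            std::cout << count / 5.0 << " hashes/sec" << std::endl;
            count = 0;
            last = now;
        }
    }
};

In the mining loop below, a meter.tick() right after the difficulty() call would do it.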

========= This variable is used without locking...

volatile bool running = true;        

Server thread:

fut = std::async( &Commissioner::generateName, &comish, name, m_priv.get_public_key() );
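
A side note on this call (an observation, not a confirmed cause of the slowdown): without an explicit launch policy, std::async is permitted to defer the work instead of running it on its own thread. Passing std::launch::async forces a dedicated thread:

// The default policy (std::launch::async | std::launch::deferred) lets the
// implementation defer the call; the explicit policy guarantees a new thread.
fut = std::async( std::launch::async,
                  &Commissioner::generateName, &comish,
                  name, m_priv.get_public_key() );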

Server loop:

nonce_t reset = std::numeric_limits<nonce_t>::max() - 1000;
while ( running && hit < target ) {
    if ( nt.nonce >= reset ) {
        // Nonce space nearly exhausted: take a fresh timestamp and restart at zero.
        nt.utc_sec = fc::time_point::now();
        nt.nonce = 0;
    } else {
        ++nt.nonce;
    }

    hit = difficulty( nt.id() );
}
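
Since running is written by another thread with no synchronization, that unlocked access is a data race, which C++11 makes undefined behavior; volatile provides no atomicity or ordering guarantees. A minimal sketch of the usual fix, assuming the flag is only ever used as a stop signal:

#include <atomic>

// std::atomic<bool> makes the cross-thread read/write well defined;
// volatile only prevents the compiler from caching the value.
std::atomic<bool> running{ true };

// Relaxed ordering is enough for a plain stop flag.
while ( running.load( std::memory_order_relaxed ) && hit < target ) {
    // ... same body as above ...
}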

Solution

Evidence is now pointing to deterministic chaotic behavior: the run is simply very sensitive to initial conditions.

The initial condition may be the timestamp data within the object that is hashed during mining.
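
A toy demonstration of that sensitivity (std::hash stands in for the real difficulty function, and all numbers here are made up): the same loop seeded with timestamps one second apart searches wildly different numbers of nonces before it stops.

#include <cstdint>
#include <functional>
#include <iostream>
#include <limits>
#include <string>

// Non-cryptographic stand-in for difficulty( nt.id() ).
uint64_t toy_difficulty( uint64_t utc_sec, uint64_t nonce )
{
    return std::hash<std::string>{}( std::to_string( utc_sec ) + ":" + std::to_string( nonce ) );
}

// Same sense as the real loop: keep searching while the hit is below target.
uint64_t nonces_until_hit( uint64_t utc_sec, uint64_t target )
{
    uint64_t nonce = 0;
    while ( toy_difficulty( utc_sec, nonce ) < target ) { ++nonce; }
    return nonce;
}

int main()
{
    const uint64_t max = std::numeric_limits<uint64_t>::max();
    const uint64_t target = max - max / 100000;  // ~1-in-100000 hit per attempt
    for ( uint64_t t = 1351000000; t < 1351000005; ++t )
        std::cout << "utc_sec " << t << ": " << nonces_until_hit( t, target ) << " nonces\n";
}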

Mods, please close.
