Question

We know that typical operating systems and high-level languages (especially those with garbage collection) cannot be used for real-time systems. Java and jet engines don't mix, and consequently a lot of study has gone into researching and developing real-time operating systems.

Is there any research as to exactly why a traditional operating system and a high-level garbage-collected language can't be used? I'm interested in any stochastic modelling that might have been done, rather than lots of words. As a simple example, is there a mathematical model for the execution rate of, say, a simple FOR/NEXT loop on something like *nix + Java? Ideally this should cover contemporary microprocessors with all their associated speculative execution, interrupt handling, and other modern optimisations such as memory reclamation and parallel processing.
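To make the kind of measurement I mean concrete, here is a minimal Java sketch (the class name, batch and iteration counts, the allocation used to provoke GC, and the reported percentiles are all arbitrary illustrative choices, not a proposed methodology). It repeatedly times a fixed-length loop and collects the per-batch timings; the spread of those samples, caused by GC pauses, interrupts, scheduling and speculative execution, is what a stochastic model would need to describe.

```java
import java.util.Arrays;

/**
 * Minimal sketch: sample the wall-clock time of repeated fixed-size loops
 * on a stock JVM, so that the per-batch timings can be fed into a
 * stochastic model (mean, variance, tail behaviour).
 */
public class LoopTimingSampler {
    static final int BATCHES = 1_000;        // number of timing samples collected
    static final int ITERATIONS = 1_000_000; // loop length per batch

    public static void main(String[] args) {
        long[] samplesNs = new long[BATCHES];
        long sink = 0; // accumulated so the JIT cannot eliminate the loop body

        for (int b = 0; b < BATCHES; b++) {
            // Allocation pressure so that GC pauses show up in the samples.
            byte[] garbage = new byte[64 * 1024];

            long start = System.nanoTime();
            for (int i = 0; i < ITERATIONS; i++) {
                sink += i ^ garbage.length;
            }
            samplesNs[b] = System.nanoTime() - start;
        }

        // Summarise the distribution of per-batch loop times.
        Arrays.sort(samplesNs);
        long min = samplesNs[0];
        long median = samplesNs[BATCHES / 2];
        long p99 = samplesNs[(int) (BATCHES * 0.99)];
        long max = samplesNs[BATCHES - 1];

        System.out.printf("min=%dns median=%dns p99=%dns max=%dns (sink=%d)%n",
                min, median, p99, max, sink);
    }
}
```

The question, then, is whether there is published work that models the distribution of such samples analytically, rather than just measuring it empirically as above.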

I would also extend this reference request to any models that might try to relate the measurement of the execution rate of such a FOR/NEXT loop to the observer effect as found in quantum mechanics.

No correct solution
