feat: make inference.py standalone; it can run against a local server, a local Docker build, or the HF Space, auto-detects which backend is available, and hard-falls back to the HF Space when none responds
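The auto-detection described above could be sketched roughly like this: probe each candidate backend in priority order and fall back to the HF Space if none answers. All endpoint URLs, ports, and function names here are placeholders, not taken from the actual repo.

```python
import urllib.request

# Hypothetical endpoints -- the real URLs/ports are assumptions, not from the commit.
CANDIDATES = [
    ("local-server", "http://localhost:8000/health"),
    ("local-docker", "http://localhost:8080/health"),
    ("hf-space", "https://example-user-example-space.hf.space/health"),
]

def probe(url: str, timeout: float = 2.0) -> bool:
    """Return True if the endpoint answers with HTTP 200 within the timeout."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except Exception:
        return False

def pick_backend() -> str:
    """Return the first reachable backend; hard fallback is the HF Space."""
    for name, url in CANDIDATES:
        if probe(url):
            return name
    return "hf-space"  # hard fallback when nothing is reachable
```

A call site would then route requests based on `pick_backend()`, so the same script works unchanged across all three environments.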
main logic complete; inference.py runs as expected. Next: fine-tune the reward functions and scoring so they make sense end to end, and verify full OpenEnv spec compliance