Curious as to how to go about properly calculating the server's load? The way I did it (which I don't believe is working correctly) was the following.
Code:
scheduler.schedule(new Task() {
    @Override
    protected void execute() {
        long startTime = System.currentTimeMillis();
        itemHandler.process();
        playerHandler.process();
        npcHandler.process();
        shopHandler.process();
        CycleEventHandler.getSingleton().process();
        objectManager.process();
        fightPits.process();
        pestControl.process();
        ticks++;
        long endTime = System.currentTimeMillis();
        System.out.print("\n[Debug]: Server usage at [" + (int) (((endTime - startTime) / 600) * 100) + "%]"
                + "\n[Debug]: Started at : " + startTime
                + "\n[Debug]: Ended at : " + endTime);
    }
});
I'm not sure if I'm doing this correctly, but startTime and endTime always come out equal to one another, and I doubt the entire server is processing in under 1ms.
Since startTime and endTime are always equal, the calculation at the end evaluates to 0, displaying 0% server usage.
Help?
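For reference, here is a minimal sketch of the kind of load calculation I think is intended, assuming the 600 in the formula is a 600ms cycle length. It uses `System.nanoTime()` for finer timing resolution and floating-point division, since the integer expression `((endTime - startTime) / 600) * 100` truncates to 0 whenever the elapsed time is under 600ms (the names `LoadCalc` and `loadPercent` are just placeholders, not from the original code):

```java
public class LoadCalc {
    // Assumed cycle length the load is measured against (600ms tick).
    private static final long CYCLE_MS = 600;

    /** Elapsed time (nanoseconds) as a percentage of the cycle length. */
    static double loadPercent(long elapsedNanos) {
        double elapsedMs = elapsedNanos / 1_000_000.0;
        // Floating-point division, so sub-600ms cycles don't truncate to 0.
        return (elapsedMs / CYCLE_MS) * 100.0;
    }

    public static void main(String[] args) {
        long start = System.nanoTime();
        // ... the per-cycle processing would run here ...
        long elapsed = System.nanoTime() - start;
        System.out.printf("[Debug]: Server usage at [%.2f%%]%n", loadPercent(elapsed));

        // The integer version truncates: a 5ms cycle gives (5 / 600) * 100 == 0.
        System.out.println((5L / CYCLE_MS) * 100);
        // The floating-point version reports roughly 0.83% for the same 5ms.
        System.out.println(loadPercent(5_000_000L));
    }
}
```

With `System.currentTimeMillis()` the two timestamps can also genuinely be equal, since its granularity on some platforms is around 10-15ms; `System.nanoTime()` sidesteps that as well.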