Bullet Physics?

Status
Not open for further replies.

NoahDVS

Deepscanner
Jul 27, 2016
#21
Is it possible to decentralize some of the calculations? In essence, use players' clients as a botnet. For example, take an invasion with 100 players and 400 mobs: if each player simulates 4 mobs, that would take a huge burden off the server and reduce server bandwidth.

A possible check against hackers would be to let 2 or 3 other players calculate the outcome for the same mob; if one of them gives a different response, he is found out.

Another check would be to spawn a ghost mob from time to time: a mob that is indistinguishable from other mobs but is invisible to the player. Both the client and the server simultaneously calculate its response, and if the results don't match, the client has been fiddling with the AI code.

With lag and whatnot you'll likely get false positives (legitimate clients flagged by mistake), so an 80% agreement threshold would probably be good. When multiple people simulate the same mob, the fastest client to calculate a response and send it to the server for synchronization prevails; the other responses serve as a check.

Thoughts?
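(Purely for illustration, a minimal, engine-agnostic C++ sketch of that redundant-simulation check, with every name made up: the server collects each client's serialized response for a mob, takes the majority result, uses the fastest agreeing client for synchronization, and gives a strike to anyone who disagrees. The ghost-mob check would be the same comparison, just server vs. client instead of client vs. client.)

[CODE]
// Toy sketch of the "2-3 clients simulate the same mob" check.
// Purely illustrative: assumes each client returns an opaque string
// describing the mob's next action, and that honest clients running
// the same deterministic AI produce identical strings.
#include <iostream>
#include <map>
#include <string>
#include <vector>

struct MobResponse {
    int clientId;           // who computed it
    std::string action;     // serialized result of the mob AI tick
    double arrivalMs;       // when the server received it
};

// The fastest response wins; everyone who disagrees with the
// majority result gets a strike against them.
void resolveMob(const std::vector<MobResponse>& responses,
                std::map<int, int>& strikes,
                std::string& authoritativeAction)
{
    if (responses.empty()) return;

    // Count how often each distinct action was reported.
    std::map<std::string, int> votes;
    for (const auto& r : responses) votes[r.action]++;

    // The majority action becomes the "truth".
    std::string majority;
    int best = 0;
    for (const auto& v : votes)
        if (v.second > best) { best = v.second; majority = v.first; }

    // The fastest client that agrees with the majority is used for sync.
    const MobResponse* fastest = nullptr;
    for (const auto& r : responses) {
        if (r.action != majority) { strikes[r.clientId]++; continue; }
        if (!fastest || r.arrivalMs < fastest->arrivalMs) fastest = &r;
    }
    if (fastest) authoritativeAction = fastest->action;
}

int main() {
    std::map<int, int> strikes;
    std::string action;
    // Client 3 reports something different -> it gets a strike.
    resolveMob({{1, "move_to_cover", 31.0},
                {2, "move_to_cover", 28.5},
                {3, "stand_still",   25.0}},
               strikes, action);
    std::cout << "authoritative: " << action
              << ", strikes on client 3: " << strikes[3] << "\n";
}
[/CODE]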
I think latency would be a massive problem with that system. I'm of the opinion that we don't need to reinvent the wheel or make assumptions about what compromises will have to be made. We know nothing of the hardware that will be used, and AI may work much better in UE4. At this point in time, we're more likely to never get to play Ember at all than to run short on server power.
 
Jul 27, 2016
#22
I think latency would be a massive problem with that system.
Unless you ignore everyone with a long latency. Not absolutely everyone has to participate in it; people living closest to the servers would be favored.

A latency-independent check would go as follows:
You could periodically send a situation snapshot to the server. The server then passes the snapshot to a secondary computer that calculates the mob response. If the secondary computer's response and your response match, everything is fine. This method checks the validity of the client after the fact.
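(Again just a sketch in plain C++ with invented names: the whole idea hinges on the mob AI being a deterministic function of the snapshot, so a verification machine can re-run it later and compare against what the client claimed.)

[CODE]
// Sketch of after-the-fact validation: re-run the mob AI on a reported
// snapshot and compare against what the client claimed it did.
// Assumes the AI tick is a pure, deterministic function of the snapshot.
#include <iostream>
#include <string>

struct Snapshot {
    float mobX, mobY;          // mob position
    float playerX, playerY;    // nearest player position
    int   mobHealth;
};

// Deterministic stand-in for the real mob AI. In reality this would be
// the exact same code the client runs.
std::string mobAI(const Snapshot& s) {
    if (s.mobHealth < 20)           return "retreat";
    float dx = s.playerX - s.mobX, dy = s.playerY - s.mobY;
    if (dx * dx + dy * dy > 100.0f) return "advance";
    return "attack";
}

// Runs on the secondary/verification machine, after the fact.
bool clientLooksHonest(const Snapshot& reported,
                       const std::string& clientAction) {
    return mobAI(reported) == clientAction;
}

int main() {
    Snapshot snap{0, 0, 3, 4, 80};   // player 5 units away, mob healthy
    std::cout << std::boolalpha
              << clientLooksHonest(snap, "attack")  << "\n"   // true
              << clientLooksHonest(snap, "retreat") << "\n";  // false
}
[/CODE]

In practice the determinism is the hard part: floating-point behaviour, random seeds and tick timing would all have to match exactly between the client and the verifier.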
 

NoahDVS

Deepscanner
Jul 27, 2016
#23
Unless you ignore everyone with a long latency. Not absolutely everyone has to participate in it; people living closest to the servers would be favored.

A latency-independent check would go as follows:
You could periodically send a situation snapshot to the server. The server then passes the snapshot to a secondary computer that calculates the mob response. If the secondary computer's response and your response match, everything is fine. This method checks the validity of the client after the fact.
A latency of 20 ms (assuming everyone is that low) is still not good if the server has to send something to someone else's client (20 ms), get the result back (+20 ms) and pass it over to me (+20 ms = 60 ms total). Now what if someone has a fairly average latency such as 50 ms? And what if my computer starts slowing down from a hungry background process after the server tells it to start computing for it?
 
Jul 27, 2016
#24
A latency of 20 ms (assuming everyone is that low) is still not good if the server has to send something to someone else's client (20 ms), get the result back (+20 ms) and pass it over to me (+20 ms = 60 ms total). Now what if someone has a fairly average latency such as 50 ms? And what if my computer starts slowing down from a hungry background process after the server tells it to start computing for it?
Yes, that is true. Latency would crap out the whole system. :(

Then again, doing the AI pathfinding client side (while shooting happens server side) wouldn't be all that bad. In the worst-case scenario, a 0.2 second delay on searching for cover is probably unnoticeable and allows for more complex AIs.

I'm of the opinion that we don't need to reinvent the wheel
Yes, but the wheel has been improved over the course of history. Fixing stuff that isn't broken is about wanting to keep improving things.

UE4 is better than the Offset engine, no doubt about that, but if we are able to utilize the processing power of the players, then we can cram so much more of the good stuff into the game and make it much more enjoyable.
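(A rough, engine-agnostic sketch of the client-side pathfinding idea above, in plain C++ with made-up timings and names: the path search runs on another thread and the mob keeps doing whatever it was doing until the result shows up, so a ~0.2 s delay only means it reacts a bit late instead of the game stalling.)

[CODE]
// Sketch: fire-and-forget pathfinding that the rest of the AI doesn't
// wait on. The mob keeps its current behaviour until the path arrives.
#include <chrono>
#include <future>
#include <iostream>
#include <string>
#include <thread>
#include <utility>
#include <vector>

using Path = std::vector<std::pair<float, float>>;

// Stand-in for an expensive cover search / navmesh query.
Path findPathToCover() {
    std::this_thread::sleep_for(std::chrono::milliseconds(200)); // ~0.2 s
    return {{1, 1}, {2, 3}, {4, 4}};
}

int main() {
    // Request the path asynchronously (on the client, in the proposal).
    std::future<Path> pending = std::async(std::launch::async, findPathToCover);

    // Meanwhile the mob keeps ticking its current behaviour.
    std::string currentAction = "shoot_from_current_position";
    while (pending.wait_for(std::chrono::milliseconds(16)) !=
           std::future_status::ready) {
        std::cout << "tick: " << currentAction << "\n";  // one frame of AI
    }

    Path path = pending.get();
    std::cout << "path ready after ~0.2 s, " << path.size()
              << " waypoints -> start moving to cover\n";
}
[/CODE]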
 
Likes: DARKB1KE

NoahDVS

Deepscanner
Jul 27, 2016
#25
Yes, that is true. Latency would crap out the whole system. :(

Then again, doing the AI pathfinding client side (while shooting happens server side) wouldn't be all that bad. In the worst-case scenario, a 0.2 second delay on searching for cover is probably unnoticeable and allows for more complex AIs.

Yes, but the wheel has been improved over the course of history. Fixing stuff that isn't broken is about wanting to keep improving things.

UE4 is better than the Offset engine, no doubt about that, but if we are able to utilize the processing power of the players, then we can cram so much more of the good stuff into the game and make it much more enjoyable.
Can pathfinding actually be separated and made asynchronous from the rest of the AI? I know developers often have trouble taking advantage of multithreading with AI because things need to be done in order, and sending some tasks to the client is similar to multithreading. If they already have a way to process parts of the AI separately, then that's great. If they didn't before and found a way, why not just take advantage of the server's cores and eliminate the need for external processing? Also, 0.2 seconds may not seem like much in writing, but the AI won't be challenging if it reacts too slowly.
 
Jul 28, 2016
#26
Can pathfinding actually be separated and made asynchronous from the rest of the AI? I know developers often have trouble taking advantage of multithreading with AI because things need to be done in order, and sending some tasks to the client is similar to multithreading. If they already have a way to process parts of the AI separately, then that's great. If they didn't before and found a way, why not just take advantage of the server's cores and eliminate the need for external processing? Also, 0.2 seconds may not seem like much in writing, but the AI won't be challenging if it reacts too slowly.
They can use different approaches to the AI to make it run faster, for example splitting it into separate algorithms: one for jumping and thrusting, one for walking, one for shooting, one that takes the player's state into account (e.g. flying, kinetic ability up), or even stepping away from navmesh pathfinding and using some other method in combination with everything stated above, resulting in some good async computations :)
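(A toy illustration of that kind of split in plain C++, not how UE4 or the Offset engine actually schedules AI. The assumption is that the movement, jump-jet and attack decisions can be computed independently for one tick and merged at the end.)

[CODE]
// Sketch: run independent pieces of a mob's decision in parallel and
// merge them, instead of one big sequential AI tick.
#include <future>
#include <iostream>
#include <string>

struct PlayerState { bool flying; bool kineticAbilityUp; float distance; };

std::string decideMovement(const PlayerState& p) {
    return p.distance > 30.0f ? "close_distance" : "strafe";
}
std::string decideJumpJets(const PlayerState& p) {
    return p.flying ? "thrust_up_and_track" : "stay_grounded";
}
std::string decideAttack(const PlayerState& p) {
    return p.kineticAbilityUp ? "hold_fire_until_shield_drops" : "shoot";
}

int main() {
    PlayerState player{true, false, 45.0f};

    // Each sub-decision is its own task; they don't depend on each other.
    auto move   = std::async(std::launch::async, decideMovement, player);
    auto jets   = std::async(std::launch::async, decideJumpJets, player);
    auto attack = std::async(std::launch::async, decideAttack,   player);

    // Merge into one action set for this AI tick.
    std::cout << move.get() << " | " << jets.get()
              << " | " << attack.get() << "\n";
}
[/CODE]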
 
Jul 28, 2016
#28
Is it possible to decentralize some of the calculations? In essence, use players' clients as a botnet. For example, take an invasion with 100 players and 400 mobs: if each player simulates 4 mobs, that would take a huge burden off the server and reduce server bandwidth.

A possible check against hackers would be to let 2 or 3 other players calculate the outcome for the same mob; if one of them gives a different response, he is found out.

Another check would be to spawn a ghost mob from time to time: a mob that is indistinguishable from other mobs but is invisible to the player. Both the client and the server simultaneously calculate its response, and if the results don't match, the client has been fiddling with the AI code.

With lag and whatnot you'll likely get false positives (legitimate clients flagged by mistake), so an 80% agreement threshold would probably be good. When multiple people simulate the same mob, the fastest client to calculate a response and send it to the server for synchronization prevails; the other responses serve as a check.

Thoughts?
This requires too much effort and fiddling with the engine to make it work, and it also makes cheating easy. Just no. Everything should be done server side. If you are that worried about performance, just use a GRID GPU and compute shaders (see also GPGPU).
 
Likes: phoenix

EvilKitten

Well-Known Member
Ark Liege
Jul 26, 2016
#29
What you are not looking at is that, in terms of compute cycles, even 20 ms is a very long time. Consider a 3 GHz single-core processor with one pipeline: it can do up to 60 million cycles' worth of work in 20 ms, assuming your lag is actually 20 ms, which is going to be iffy, and of course you also have to send the result back. There is a reason why CPUs/GPUs have their own internal memory on top of system RAM: it's way, way faster than going anywhere else on the board, let alone to another computer hundreds or thousands of miles away.
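(Just to spell the numbers out, a throwaway C++ snippet, nothing engine-specific: a 3 GHz core burns roughly 60 million cycles during 20 ms of waiting, and it only gets worse at realistic latencies.)

[CODE]
// How many clock cycles a 3 GHz core burns while waiting on the network.
#include <cstdio>

int main() {
    const double clockHz = 3.0e9;                       // 3 GHz, one pipe
    for (double latencyMs : {20.0, 50.0, 60.0}) {
        double cycles = clockHz * (latencyMs / 1000.0); // cycles "lost"
        std::printf("%.0f ms of latency = %.0f million cycles\n",
                    latencyMs, cycles / 1e6);
    }
}
[/CODE]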
 

Ctzn_no7

New Member
Dec 20, 2016
#30
This requires too much effort and fiddling with the engine to make it work, and it also makes cheating easy. Just no. Everything should be done server side. If you are that worried about performance, just use a GRID GPU and compute shaders (see also GPGPU).
Actually, it's not a totally far-fetched idea.

UE4 assigns a controller to each character (AI or player).

In an open-world scenario, you could easily assign each connected client to handle about 5-10 AI units and you would never notice the performance hit (performance hits kick in at around 500 advanced AI units).

Now, to prevent cheating, these could be AI units that are _not interacting_ with the controlling client, which would prevent gaining any benefit from cheating (unless 10-50 players are doing it simultaneously, which is unlikely to happen).

On top of that, shuffling the controlling clients around is pretty easy for a third party, simply by using the [possess] and [unpossess] functions, which have no performance impact, so in theory you could keep switching the AI "owners" constantly (every 5 sec or so) to further remove any benefit cheating could bring.

The downside is that an AI could possibly "fall asleep" in the middle of a fight because its client alt-tabbed out, lagged or disconnected, until it gets a new master brain.
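(An engine-agnostic sketch of the owner-shuffling in plain C++, with every identifier invented: the real thing would go through the engine's possess/unpossess calls as described above, but the round-robin bookkeeping itself would look roughly like this.)

[CODE]
// Sketch: periodically reassign each AI unit to a different client so no
// client controls a mob it is interacting with for long.
#include <iostream>
#include <vector>

struct AIUnit { int id; int ownerClient; };   // -1 = server-controlled

// Every ~5 seconds, rotate which connected client "possesses" each AI.
void shuffleOwners(std::vector<AIUnit>& units,
                   const std::vector<int>& connectedClients,
                   int rotation)
{
    if (connectedClients.empty()) {
        for (auto& u : units) u.ownerClient = -1;  // fall back to the server
        return;
    }
    for (size_t i = 0; i < units.size(); ++i) {
        // Spread units across clients, shifted by the rotation counter.
        size_t idx = (i + rotation) % connectedClients.size();
        units[i].ownerClient = connectedClients[idx];
    }
}

int main() {
    std::vector<AIUnit> units = {{1, -1}, {2, -1}, {3, -1}, {4, -1}};
    std::vector<int> clients  = {101, 102, 103};

    for (int rotation = 0; rotation < 3; ++rotation) {   // three 5 s periods
        shuffleOwners(units, clients, rotation);
        std::cout << "rotation " << rotation << ":";
        for (const auto& u : units)
            std::cout << "  mob " << u.id << " -> client " << u.ownerClient;
        std::cout << "\n";
    }
}
[/CODE]

The server stays authoritative either way; it only decides who simulates what for the next few seconds.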
 
#31
The downside is that an AI could possibly "fall asleep" in the middle of a fight because its client alt-tabbed out, lagged or disconnected, until it gets a new master brain.
For that case you could play a "dazzled" animation where the enemy just loses its overview of the situation, or something similar. You could also require that, to stay an AI's "owner", you have to send packets in under 75 ms, or the AI gets switched to another client until the next regular switch. (That would prevent laggy enemies with bad reaction times and timed-out enemies.)
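(And the 75 ms rule as a quick sketch, again with invented names and thresholds: the server tracks the last packet time per owner and takes back anything that goes quiet until the next regular shuffle.)

[CODE]
// Sketch: drop an AI's current "owner" if that client hasn't sent a
// control packet within the deadline, and hand the mob back to the server
// (or the next client) until the regular shuffle comes around.
#include <iostream>
#include <map>

const double kDeadlineMs = 75.0;

struct OwnedMob { int ownerClient; double lastPacketMs; };

void enforceDeadline(std::map<int, OwnedMob>& mobs, double nowMs) {
    for (auto& [mobId, mob] : mobs) {
        if (mob.ownerClient == -1) continue;               // already on server
        if (nowMs - mob.lastPacketMs > kDeadlineMs) {
            std::cout << "mob " << mobId << ": client " << mob.ownerClient
                      << " timed out, server takes over\n";
            mob.ownerClient = -1;                          // play "dazzled" here
        }
    }
}

int main() {
    std::map<int, OwnedMob> mobs = {
        {1, {101, 940.0}},   // last packet 60 ms ago -> fine
        {2, {102, 900.0}},   // last packet 100 ms ago -> reassigned
    };
    enforceDeadline(mobs, 1000.0);
}
[/CODE]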
 