While I was working on this atrocity, I noticed a huge problem with logic: it lacks sequentiality (not sure if that's the right word). The problem is that it's impossible to reliably predict how a circuit will behave during lag spikes.
You see, sometimes a logic gate sends a new signal even though its previous signal hasn't been processed yet. It's kinda hard to explain, so let me show you instead.
Here I simulate a lag-free environment:
I manually press a button to set the fourth digit to 3, and it works perfectly.
It will never break, no matter how hard you try. Why does it work perfectly? Because manual input makes the system not time-critical.
Now I try to activate that clock. It basically means a button gets pressed every second.
Now it produces nonsense:
Why does this happen? Because the first part of the clock is fairly large and complex, and even though it isn't time-critical by itself, it makes the whole clock time-critical due to the lack of sequencing. A gate never waits for its signal to be processed; it just sends a new one. That's fine in a lag-free situation, but under lag it turns into a complete mess, and in extreme cases even DELAYing can fail.
That makes complex logic circuits extremely unreliable, and there is currently no way to counteract it.
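To make the failure mode concrete, here is a tiny, purely illustrative simulation (Python, not game code; all names and numbers are made up). The "clock" fires a pulse on a fixed interval whether or not the previous pulse has finished crawling through a slow sub-circuit, so under lag new pulses overwrite old ones before they ever arrive:

```python
# Purely illustrative simulation of the problem (not game code).
# A "clock" fires a pulse every `interval` ticks. The slow sub-circuit can
# only hold one in-flight pulse; because the clock never waits for an
# acknowledgement, a new pulse simply overwrites the previous one.

def run(ticks, interval, propagation_delay):
    pending = None              # arrival tick of the pulse currently in flight
    sent = delivered = 0
    for t in range(ticks):
        if t % interval == 0:   # clock edge: fire without waiting
            pending = t + propagation_delay
            sent += 1           # any pulse still in flight is silently lost
        if pending is not None and t >= pending:
            delivered += 1      # the pulse finally made it through
            pending = None
    return sent, delivered

print(run(ticks=100, interval=10, propagation_delay=3))   # lag-free: (10, 10)
print(run(ticks=100, interval=10, propagation_delay=25))  # lagging:  (10, 0)
```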
What I suggest is making logic sequential by default. What does that mean?
1. When a gate's state changes, a queue of receivers is formed, in the order they were connected. So if button 1 is connected to a flip-flop and then to button 2, the queue is "flip-flop, button 2".
2. The signal is sent to each receiver in queue order. The receiver must send a virtual acknowledgement signal back to the sender.
3. The gate does not continue until it gets that acknowledgement from the receiver. Of course, if the receiver no longer exists, it is skipped; it is also skipped if the link was broken while the sender was waiting for the answer.
4. Only when the queue is empty does the engine move on to the next gate.
By doing that, most logic systems would become impervious to lag.
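Here is a rough sketch of how points 1-4 could fit together (Python, purely hypothetical; Gate, Receiver, connect, change_state and receive are made-up names, not actual game code):

```python
# Hypothetical sketch of points 1-4 (made-up names, not actual game code).

class Receiver:
    def __init__(self):
        self.destroyed = False
        self.state = False

    def receive(self, signal):
        # Fully process the signal, then acknowledge the sender.
        # Here the "virtual signal of receiving" (point 2) is simply
        # the return value of this call.
        self.state = signal
        return True


class Gate:
    def __init__(self):
        self.receivers = []          # kept in the order the links were made

    def connect(self, receiver):
        self.receivers.append(receiver)

    def change_state(self, signal):
        # Point 1: snapshot the receivers into a queue, in connection order.
        queue = list(self.receivers)
        while queue:                 # point 4: move on only once the queue is empty
            receiver = queue.pop(0)
            # Point 3: skip receivers that were destroyed or unlinked
            # while earlier entries were still being acknowledged.
            if receiver.destroyed or receiver not in self.receivers:
                continue
            # Point 2: deliver the signal and wait until the receiver
            # acknowledges it before touching the next entry.
            receiver.receive(signal)


# Example: button 1 connected to a flip-flop and then to button 2,
# so the queue is "flip-flop, button 2" and is served in that order.
flip_flop, button2 = Receiver(), Receiver()
button1 = Gate()
button1.connect(flip_flop)
button1.connect(button2)
button1.change_state(True)
```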