This is way too easy. The chat is so organized. :-) They'll be done in roughly 30 minutes times the input delay.
The original twitchplayspokemon took about a month, if I remember correctly, and it worked because nobody had a concrete plan, unlike here. Their first voting system was very chaotic: an input was selected from the chat every x seconds instead of by voting consensus. This time, as long as at least 50% of the chat votes correctly, they will win without mistakes.
A less boring twitchplays would just be a virtual machine running Linux and some abstract/vague goal.
Because we're supposed to have speedy trials. The state should have the right to hold suspects until trial, because otherwise you'd be letting potential criminals run free (and run away).
That algorithm is only used to generate an input pattern. When that pattern is fed back to the same sorting function, it causes it to go quadratic. If your target uses the same sorting function, it will do the same on their side.
This attack is realistic. I have tried it on the glibc qsort, generic quicksort, and Visual Studio qsort, and they were vulnerable.
I'm interested in the way you use the word "attack."
I was looking at it from the perspective that you could never run into an input that would consistently sort in O(n^2) using randomized quicksort.
But your perspective (maybe this article's perspective) is that some malicious adversary could force the sorter to do extra work by knowing in advance the steps the sorter will take. Am I getting that right? Like from a flops-security standpoint?
If so, why care about this situation? This only seems like a realistic scenario in very weird domains, like Amazon Web Services messing with everyone's calls to some standard sorting code to charge them more cents for compute time or something...
If I'm not mistaken, the attack is aimed at a quicksort implementation where the pivot is chosen as the median of the first, middle, and last values. It would not work on randomized quicksort, of course.
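To see why pivot choice is the whole ballgame here, a toy sketch helps. This is a deliberately simplified illustration (not glibc's actual qsort, which uses more sophisticated pivot selection): with a first-element pivot, an already-sorted array is itself the killer input, while a randomized pivot shrugs it off. Defeating a median-of-three pivot needs a more carefully constructed input, like the adversary the article describes, but the quadratic blowup mechanism is the same.

```python
import random
import sys

def quicksort(a, pick_pivot, counter):
    """Toy out-of-place quicksort. counter[0] accumulates the number of
    element-vs-pivot comparisons a real partition pass would make."""
    if len(a) <= 1:
        return a
    pivot = pick_pivot(a)
    counter[0] += len(a)  # each element gets compared against the pivot once
    less    = [x for x in a if x < pivot]
    equal   = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    return (quicksort(less, pick_pivot, counter) + equal +
            quicksort(greater, pick_pivot, counter))

sys.setrecursionlimit(10000)  # the degenerate case recurses ~n deep
n = 2000
worst = list(range(n))  # already sorted: worst case for a first-element pivot

c_fixed = [0]
quicksort(worst, lambda a: a[0], c_fixed)             # deterministic pivot
c_rand = [0]
quicksort(worst, lambda a: random.choice(a), c_rand)  # randomized pivot

print(c_fixed[0], c_rand[0])  # ~n^2/2 vs ~n log n comparisons
```

On the sorted input, the fixed pivot partitions off one element per level, so the comparison count grows quadratically; the randomized version stays near n log n no matter what input the adversary hands it, which is exactly why the attack doesn't apply to it.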
Autopilot disengaged (with a loud beep and a visual warning) and the driver didn't have his hands on the wheel like he's supposed to. Consequently, the car started to drift left. In addition, Autopilot can only be used on a highway.
Several mistakes were made by the driver, so the blame is on him.