Fake Scott Greenfield
Generated by Doctor Nerve's Markov Page, which "allows the writer to type in prose or poetry, and submit it to a Markov Chain engine. This engine munches through the writer's text, performs a statistical analysis, and spits out statistically similar text":
Proposition 6 in one capacity eventually end the retirement age, aside from not just after the courts. It's bad for the notion that judges, and were brilliant later. If they don't seek to decide the notion that there are at their game." So I voted in their clerks? Whether a margin of the bench, old people become painfully clear. And anyone a judge is based solely on what sitting in the age for the first place, or turned sour, lazy or turned sour, lazy back when, they are really don't see it. It's bad for the New York election proposed to 39%. Chief Judge Lippman said in their failing competence exposed? This changes the bench, old people we do their first job of a burden. We shouldn't congratulate someone upon being a while can fill their jokes. If they don't kiss ass enough or turned sour, lazy back when, they were like before the problem is that people become judges.
I recall reading some thirty years ago (probably in Scientific American) about a program called Racter (not, I think, this Racter), which took a body of literature, computed the frequency of each letter following each other letter, and then generated likely-sounding text.
So, for example, if "_th" is followed by "y" 1% of the time, "a" 20% of the time, "e" 40% of the time, "r" 5% of the time, and "i" 34% of the time, then when the last three characters Racter had generated were "_th" the next character would be one of those, with probability in proportion to their frequency in the body of literature. It may have worked more than three orders deep: it may have chosen the next character based on the previous four, five, or more characters rather than just three.
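To make that concrete, here's a minimal character-level sketch in Python. The names `build_model` and `generate` are mine, not Racter's, and the order-3 window matches the "_th" example above; this is the idea as I've described it, not Racter's actual implementation:

```python
import random
from collections import Counter, defaultdict

def build_model(text, order=3):
    """For each `order`-character window, count which character follows it."""
    model = defaultdict(Counter)
    for i in range(len(text) - order):
        model[text[i:i + order]][text[i + order]] += 1
    return model

def generate(model, order=3, length=500):
    """Walk the counts: each next character is drawn with probability
    proportional to how often it followed the current window."""
    window = random.choice(list(model))
    out = list(window)
    for _ in range(length):
        counter = model.get(window)
        if not counter:  # window has no recorded successor; restart
            window = random.choice(list(model))
            out.extend(window)
            continue
        chars, weights = zip(*counter.items())
        nxt = random.choices(chars, weights=weights)[0]
        out.append(nxt)
        window = window[1:] + nxt
    return "".join(out)
```

Raising `order` makes the output sound more like the source: with a window of 1 you get the letter soup below, and with a window of 6 you get near-English.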
This is, as I understand it, a "Markov Chain." Doctor Nerve's Markov Page is a word-based Markov Chain generator (it generates random sentences from the words in your text). Wlonk.com has a character-based Markov Chain generator, into which you can paste your text and which lets you select the level (the "depth" or "window size") of the chains.
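The word-based variant differs only in what counts as a token: the windows are tuples of words instead of runs of characters. A sketch under the same assumptions (hypothetical names, same counting idea as above):

```python
import random
from collections import Counter, defaultdict

def build_word_model(text, order=2):
    """Same counting idea, with words as tokens instead of characters."""
    words = text.split()
    model = defaultdict(Counter)
    for i in range(len(words) - order):
        model[tuple(words[i:i + order])][words[i + order]] += 1
    return model

def generate_words(model, order=2, length=120):
    """Emit words one at a time, weighted by how often each word
    followed the current `order`-word window in the source."""
    window = random.choice(list(model))
    out = list(window)
    for _ in range(length):
        counter = model.get(window)
        if not counter:  # dead end; restart from a random window
            window = random.choice(list(model))
            out.extend(window)
            continue
        words, weights = zip(*counter.items())
        nxt = random.choices(words, weights=weights)[0]
        out.append(nxt)
        window = window[1:] + (nxt,)
    return " ".join(out)
```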
Here's Wlonk's version of Greenfield, with a level of 1:
Tj ealneoetupe rd i tkidnn ptC.wtt rb snh lt bTtl yiglt oh ti s7dlie py nietce Umotb inodnlroediuSdhineeorIrosl ir v egclyupeondli o opm T tu rho ef o rBown cny,heo lo iwttot dugded.cnuakeeh a vf tildoir e.o nsdafgtp ctaoteaeuryee etirnee scehhep arspietahl ' dvleImfe,eb.sspn trtcdrv ltav otuide hdnp etyinnihcdeete o l g oobhs tltso e oieejtoeteoacnirktomatttsseytrodmseseulaYrhrlothhqoljtjei eoit leo Yiot oubvgsloglmainnmeoh. mrk,p istud a ehoako l hoi.deooo ogydteyhge
And here's Wlonk's Fake Greenfield Level 6:
The proposition on that's a responsibility, a burden on the needs fixing, but give only retirement age and dirtied their prime, it's bad for a while I voted in an interview. "To me, is undertaken. It lost. Well, not making that's an anachronism in them so their prime, is undeniable. Angry Blond lawyers know too long? Are judges who has deprived to laugh at the propositions with the proposition from worthless organizations to have a constitutional presumption for someone upon being a judges to you wo
The more text you have, the higher the level you can use to generate random text without just spewing things that have already been written. (With a short text and a deep window, most windows occur only once, so the chain can do nothing but reproduce its source verbatim.)
With everything that has ever been written available on the Internet, it seems that the time is ripe for a database that would allow the generation of high-order Markov Chains for all sorts of entertaining results: a complete-works-of-William-Shakespeare database, a KJV database, a Conan Doyle database, a Cervantes database, a Scott Greenfield database, and various combinations.
Does such a thing exist?
If you can't point me toward one, I may have to commission it.