I think the reason so many LOTR ripoffs fail is that they make their Aragorn analogue the main character, when the entire point of Aragorn is that he’s “the person the villains think is the main character, but is Not.”
Aragorn seems like a traditional King Arthur style hero— he has huge Main Character Energy because he’s supported by destiny, by bloodline, by all these magic artifacts and prophecies, and so on. Frodo and Sam are Just Some Guys. Aragorn recognizes that Sauron understandably thinks he’s the main hero of this story… and he pretends to believe it too, spending the entire series using himself as a diversion to prevent Sauron from seeing Frodo and Sam.
Aragorn’s whole thing is that he knows he seems like the Main Hero of this legend to people who don’t know better— but he also knows that he isn’t, and that his role is just to keep Sauron’s eye on him in order to protect the people around him.
And it works! Sauron is so fixated on defeating his Legendary Destined Archenemy with Extreme Main Character Energy that he completely overlooks the two ordinary little guys who were the real threat to him all along.
(via epimeral)
This is not an exaggeration. Your download speed would slow down to the point where Windows would make this kind of absurd estimate, and you’d sigh and leave the room for a while (because you couldn’t use the computer while it was doing this for fear it would crash and lose all your progress) and then you’d come back in 40 minutes and maybe it would now say 52 years or maybe it would say 3 minutes, who knew, not Windows.
(via themischiefoftad)
people saying “I asked chatgpt a question and I assumed its response was correct” is to me a thousand times scarier an application of AI than whatever the effective altruist weirdos are scared of
I work in a bio research lab and we were fucking around with it for fun a little while back. We decided to ask it one of the questions that one of the grad students was working on and incredibly it spit out a logically coherent answer and cited working links to real publications.
The catch is, the linked publications were completely unrelated articles from open-source journals since chatgpt can’t access papers behind paywalls, which is a lot of papers. Furthermore, what it was saying was horseshit. It sounded so vaguely convincing that we had to show it to the aforementioned grad student, who confirmed it was nonsense. The output was pretty similar to how something would be presented in a normal paper and referenced real molecules in tangentially coherent ways, but again, horseshit.
We all work in the field and we still had to confirm that how it answered the question was incorrect. If someone who wasn’t in the field read it and saw that the links were real papers but didn’t actually check if they were relevant, they might have been convinced. The question we asked was ultimately benign but thinking about the obvious potential misuses if people aren’t careful definitely made messing with it less fun after that. Shit’s scary.
(via gravityeyelids)
girl.. i saw you shrieking in the middle of the forest to summon terrifying creatures. can i get your number
(via wulcanbiology)
The “these two things are not related” at the end absolutely elevates this to god tier
abSOLUTELY NO
(via thegayraptor)
Absolutely bonkers that I’m now one of those weirdos you hear about on Twitter
I committed to the bit so hard that I also committed misdemeanor impersonation of a government official
(via allieisnothere)
What the fuck
This is absolutely fascinating. I’ve now been looking at Alex Colville’s paintings and trying to work out what it is about them that makes them look like CGI and how/why he did that in a world where CGI didn’t exist yet. Here’s what I’ve got so far:
- Total lack of atmospheric perspective (things don’t fade into the distance)
- Very realistic shading but no or only very faint shadows cast by ambient light.
- Limited interaction between objects and environment (shadows, ripples etc)
- Flat textures and consistent lighting used for backgrounds that would usually show a lot of variation in lighting, colour and texture
- Bodies apparently modelled piece by piece rather than drawn from life, and in a very stiff way so that the bodies show the pose but don’t communicate the body language that would usually go with it. They look like dolls.
- Odd composition that cuts off parts that would usually be considered important (like the person’s head in the snowy driving scene)
- Very precise drawing of structures and perspective combined with all the simplistic elements I’ve already listed. In other words, details in the “wrong” places.
What’s fascinating about this is that in early or bad CGI, these things come from the fact that the machine is modelling very precisely the shapes and perspectives and colours, but missing out on some parts that are difficult to render (shadows, atmospheric perspective) and being completely unable to pose bodies in such a way as to convey emotion or body language.
But Colville wasn’t a computer, so he did these same things *on purpose*. For some reason he was *aiming* for that precise-but-all-wrong look. I mean, mission accomplished! The question in my mind is, did he do this because he was trying to make the pictures unsettling and alienating, or because in some way, this was how he actually saw the world?
(via sierscarf)
bebx:
When you’re scrolling through AO3 for some fanfics to read, what’s the MOST IMPORTANT FACTOR you use to decide whether or not you’ll read something?
The summary
Word count
Rating
Tags
The fic title
Pairing / characters
Feedback: how many kudos / hits it has
How many chapters it has, and whether it’s completed (if not, the Last Update date)
The authors
**Other (please put in comments or tags)**
Reblogging after voting for a bigger sample size is highly appreciated :)
(via trensu)