A name for this?
Isn't it ironic, dontcha think?
I was concerned about AI as an existential risk for years before big tech had a word to say on the matter. It can both be a genuine threat and something large companies are trying to take advantage of.
But not a Venus flytrap. Or a pitcher plant. Or rafflesia. Or...
You don't think nearly 1/6th is statistically significant? What's the lower bound on significance as you see things?
To be clear, it's obviously dumb for their generative system to be overrepresenting turbans like this, although that's likely a bias in the inputs rather than something the system came up with itself. I just think 5% is generally enough to be considered significant, and calling three times that "not significant" confuses me.
What's worse is that it's not evenly distributed across the set of young talent. The most capable, most impressive, most outstanding talent is going to have the most options, and thus is most likely to go. It isn't just a halving of the upcoming workforce; it's a lowering of the average quality of that reduced force.
Not my content; the article has a vexing system of gatekeeping, so I reposted it here.
"Quiet quitting? Oh, do you mean acting my wage?"
It's not an apology if you keep trucking right the fuck on along. You apologize, in part, by fixing the problem. Absent that, it's just empty words. Meaningless.
1.7, 1.6, 1.5.....
I'm well aware, but we don't get to build an AGI first and figure out alignment afterwards, and we can't keep even today's systems on target. See any number of "funny" errors people have posted, up to the paper I can't recall the name of offhand that collected examples of even simpler systems being misaligned.