lvxferre

joined 3 years ago
[–] lvxferre 24 points 11 months ago* (last edited 11 months ago) (3 children)

I love how it pokes fun at an ackshyually, and then proposes a monophyletic clade for arseholes.

...at the end of the day herpetology studies tetrapods minus the ones that ornithologists and mammalogists called dibs on. You'll see the same in medicine - vets treating all animals, except the species that physicians said "NOPE, I GOT THIS ONE".

[–] lvxferre 2 points 11 months ago (1 children)

Together, they make up a bit more than 50% of active users, yet basically all far-right trolls are from there.

That doesn't say much about LW besides it being the biggest instance - because trolls beeline towards larger audiences.

but if you have more users to moderate, you should also increase moderating capacity or close registrations.

Closing registrations might be the sensible approach here: the necessary moderation grows far faster than the user count, and eventually too large a mod/admin team becomes a problem on its own.

[–] lvxferre 3 points 11 months ago* (last edited 11 months ago) (2 children)

Then again you’re talking about Beehaw, their users react so badly to anyone telling them they might be wrong that it’s not surprising their mods need to spend a disproportionate amount of time taking action against other users.

I think that you're partially right.

The sort of online fight that Beehaw seeks to avoid depends on having at least one dumb fuck¹. Two cause it faster, but one is enough - because smart people have an easier time reaching agreement or realising that no agreement is possible.

Both sides (LW+SJW and Beehaw) have their share of smart posters and dumb fucks². The latter are there for different reasons:

  • LW+SJW absorbed the bulk of the Reddit diaspora, and Reddit culture revolves around being a dumb fuck;
  • Beehaw doesn't prevent you from being a dumb fuck, as long as you aren't the one apparently starting the fight - effectively sheltering dumb fucks who are really good at passive aggression, and who'll have a meltdown once you say "what you said is incorrect, here's why", as they'll interpret it as a personal attack.

If you're a smart user, you eventually learn how to handle the dumb fucks in your instance in a way that's allowed there: chewing them out (within limits) or avoiding them like the plague. But those approaches break once you're handling someone from the other side:

  • smart Beehaw user interacting with LW+SJW dumb fuck: "I feel like I'm always swallowing frogs with those users, as they say stupid shit and I don't want to be rude."
  • smart LW+SJW user interacting with a Beehaw dumb fuck: insert your first paragraph here. It's why I think that you're partially right.

notes

  1. For the sake of this comment, I'm defining "dumb fuck" as someone who assumes too much, oversimplifies, disregards context, focuses too much on who says something instead of what is being said, lacks basic understanding of what other users say, or a mix of those.
  2. Note that, while it's useful to pretend that "dumb fuck" and "smart user" are different categories of people, they are not - they're different categories of user behaviour, i.e. the same person could theoretically be a dumb fuck in some situations and a smart user in others.
[–] lvxferre 3 points 11 months ago

Typically 1~10. Four right now (all four are Lemmy: inbox, another thread, front page, this thread).

I close them when the tab bar feels cluttered and/or I see no reason to keep them open.

[–] lvxferre 5 points 11 months ago (1 children)

An actually interesting video! I'm clueless when it comes to fonts, but here are a few comments about the start (where he gives some historical background):

What the Phoenicians did was to take a look at the hieroglyphs like, "Yeah! Love that! But... what if we made the symbols even more abstract?"

I know he's oversimplifying it (as it is not the focus of the video), but it's worth noting that this abstraction was done by the Egyptians themselves, within hieroglyphic writing. Hieroglyphs often use something called the "rebus principle", where you represent a word with a similar-sounding word. For example, "son of" /sa/ was often represented with ⟨𓅬⟩, a white-fronted goose /za/ - because they sound practically the same.

(It's a lot like writing "I like The Beatles" as ⟨👁️👍🪲🪲⟩. Why ⟨👁️⟩? Because it sounds the same as "I".)

What the Canaanites (including the Phoenicians) did was to use this rebus principle in a more consistent way, and only for the first consonant of the word. For example, ⟨𓉐⟩ (a house) representing [b] because "house" in those languages usually starts with that sound. That's the start of the phonetic principle (graphemes represent sounds instead of concepts).

There's yet another level of abstraction, one whose rise is hard to pinpoint: instead of representing the "raw" sounds, you represent the underlying phonemes. That's why, for example, the /p/ in ⟨pit⟩ [pʰ] and in ⟨spit⟩ [p] gets the same letter - although they sound different, they're still the same phoneme.

Now, there were a couple of problems with this early alphabet from the Greeks: it only had uppercase, and while they wrote in rows, sometimes they wrote LTR, sometimes RTL

Ah, come on, that's silly - neither is a "problem" of the lapidary early Greek alphabet. It's just the absence of a feature that he's used to, and the presence of another.

For comparison: this is on the same level as an Arabic or Farsi speaker saying "now, there are a couple of problems with the modern Latin alphabet, such as the lack of initial/medial/final forms, and writing the vowels with their own letters as if they were consonants."

Enter the Romans...

Further info on the alphabet. Be warned that it's mostly trivia.

  • ⟨G⟩ is a later innovation, more specifically from 230 BCE. Originally the Roman alphabet used ⟨C⟩ for both /k/ and /g/.
  • Including ⟨J⟩ was a mistake - it was not a letter back then; it originated as a curled ⟨I⟩ in the Middle Ages. ⟨I⟩ and ⟨J⟩ were "split" into separate letters rather recently.
  • ⟨U⟩ was not a letter back then either, but he got that right. Same deal as with ⟨J⟩, except the split was between ⟨V⟩ and ⟨U⟩.
  • ⟨K⟩ was only marginally used. You do see it popping up in native Latin words, but it's usually for Greek borrowings - especially after Latin shifted /k/ to sound like [tʃ] (as in chill) before front vowels.
  • ⟨Y⟩ was mostly used for Greek borrowings, representing the sounds [ʏ y:] (as in German Müller and über). Latin itself lacked the sound, and odds are that most speakers butchered those words to sound like [ɪ i:] (as in bit and beet) instead.
  • ⟨W⟩ is not there because, although ⟨V⟩ represented three sounds in Latin, [w ʊ u:] (as in wool, book and boot), confusing [w] with [ʊ] was not a big deal (more on [u:] later). It wreaked havoc for Germanic dialects though, so they started representing the consonant with a digraph, ⟨VV⟩.
  • ⟨Z⟩ used to be the sixth letter of the alphabet. Then it was kicked off the alphabet for being "too foreign". Then it came back at the end.
  • One Roman emperor (Claudius) tried to introduce three letters into the alphabet, ⟨Ↄ Ⅎ Ⱶ⟩, which were supposed to represent [ps w ɨ] (as in cops, wool, and Polish byt). They were mostly forgotten.
  • The Romans used a diacritic to represent vowel length: the apex. For most of its history it looked like its descendant (the modern acute), except over ⟨I⟩ - there, people wrote a longer ⟨I⟩ instead.
[–] lvxferre 15 points 11 months ago* (last edited 11 months ago)

It's inference based on mouth movements, but it isn't as rough as it seems - context plays a huge role in disambiguation, just like it does for you with homonyms that you hear. It's just that the number of words that look similar when you lip-read is larger than the number of words that sound the same, since some sounds are distinguished by articulations that you can't readily see (such as [f] vs. [v] - you won't see the vocal folds vibrating for the latter, so "fine" and "vine" look almost* the same).

Also, the McGurk effect hints that everyone uses a bit of lip reading on an unconscious level; for most of us [users of spoken languages], though, it only disambiguates/reinforces the acoustic signal.

*still not identical - for [v] the teeth will touch the bottom lip a bit longer.

[–] lvxferre 6 points 11 months ago

Pros: no fanning/bellowing
Cons: needs a stool to reach the chimney

...this channel is a treat. He made a natural draft furnace 6y ago, and with this one the improvements are visible - addressing short-circuit drafts, stronger airflow due to the parallel tuyeres, and apparently a far higher yield. (Back then he got no iron from the slag around the tuyere.)

[–] lvxferre 2 points 11 months ago

Driving safely and smartly is essential for other reasons: it prevents additional bottlenecks (you mentioned one, wrecks), and it reduces the impact of the unavoidable bottlenecks (because the cars won't waste so much time re-accelerating after them). But if my reasoning is correct, most of the time there isn't much that drivers can do against traffic besides "don't use the car".

[–] lvxferre 3 points 11 months ago

I believe that it should still reduce the likelihood of spam calls - because you don't advertise where you aren't selling stuff, and if you're selling stuff you don't want to piss off the local government.

For reference: where I live, the "do not call" list is state-run. Most of those spam calls come from people in other states of the same republic, thus not subject to my state's rules - and yet the "do not call" list still does its job.

[–] lvxferre 2 points 11 months ago* (last edited 11 months ago)

I get its basic shit that’s over my head.

It's over everyone's head. That's why I shared it here.

Likewise would 0.888… be .9?

No, but 0.8999... = 0.9. This only works when the repeating digit is the highest digit of your base; we're using base 10, so it has to be 9.
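A quick sanity check of that claim, reusing the same 0.999... = 1 fact one decimal place down (just a sketch, nothing rigorous):

```latex
\begin{align*}
0.8999\dots &= 0.8 + 0.0999\dots \\
            &= 0.8 + 0.1 && \text{since } 0.0999\dots = \tfrac{1}{10} \times 0.999\dots = \tfrac{1}{10} \\
            &= 0.9
\end{align*}
```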

If I have 100 dogs, and I split them into thirds I’ve got 3 lots of 33 dogs and 1 dog left over. So the issue is with my original idea of splitting the dogs into thirds, because clearly I haven’t got 100% in 3 lots because 1 of them is by itself.

Then you split the leftover dog into 10 parts. Why 10? Because you use base 10. Three of those parts go to each lot of dogs... and you still have 1/10 of a dog left.

Then you do it again, and you have 1/100 of a dog left. And again, and again, infinitely.

If you take that "infinitely" into account, then you can say that each lot of dogs has exactly one third of the original amount.
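As a worked sum, sketching the same idea with symbols: each lot gets 33 dogs, plus 3/10 of a dog, plus 3/100, and so on; that infinite tail adds up to exactly one third of a dog:

```latex
\begin{align*}
\text{each lot} &= 33 + \frac{3}{10} + \frac{3}{100} + \frac{3}{1000} + \dots \\
% the tail is a geometric series with first term 3/10 and ratio 1/10:
\sum_{k=1}^{\infty} \frac{3}{10^{k}} &= \frac{3/10}{1 - 1/10} = \frac{1}{3} \\
\text{each lot} &= 33 + \frac{1}{3} = \frac{100}{3} \text{ dogs exactly, nothing left over}
\end{align*}
```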

[–] lvxferre 2 points 11 months ago* (last edited 11 months ago)

Because it isn't 0.9; it's 0.999..., with the ellipsis ("repeat this infinitely") being part of the number. And you don't need to round anything to get 0.999... = 1: the 9 keeps going on and on, so the difference between them is smaller than any positive number - that is, zero.

Another thing showing that they're the same number is that there is no number between them. For example:

  • 0.9 (no ellipsis) and 1 are different because 0.95 is between them
  • 0.95 and 1 are different because 0.97 is between them
  • there's no number between 0.999... (with ellipsis) and 1, so they are the same. (Inb4: there's no "last nine" to tweak, because the nines are infinite.)
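And if someone prefers the classic algebra route, here's the usual sketch (it leans on the fact that multiplying by 10 shifts the decimal point without shortening the infinite tail of 9s):

```latex
\begin{align*}
x &= 0.999\dots \\
10x &= 9.999\dots \\
10x - x &= 9.999\dots - 0.999\dots && \text{the infinite tails cancel exactly} \\
9x &= 9 \\
x &= 1
\end{align*}
```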
[–] lvxferre 20 points 11 months ago (6 children)

Based on a game*, I think the root issue is that there are multiple bottlenecks that drivers can't avoid, like turning or entering/leaving lanes, forcing them to slow down to avoid crashing. Not a biggie if there are only a few cars, as they'll be distant enough from each other that one can slow down a bit without the one behind needing to do the same; but once the road is close to its carrying capacity, that has a chain effect (sketched in the toy simulation at the end of this comment):

  • A slows down because it'll turn
  • B is too close to A, so it slows down to avoid crashing into A
  • C is too close to B, so it slows down to avoid crashing into B
  • [...]

There are solutions for that, such as building structures to handle those bottlenecks, but those structures are often spacious, and space is precious in a city. Alternatively, you reduce the number of cars by discouraging people from using them willy-nilly - with a good mass transport system, and by making cities less shitty for pedestrians.

*The game in question is OpenTTD. This is easy to test with trains: create some big transport route with multiple trains per rail, then keep adding trains to that route while watching the time they take to go from start to end. The time stays roughly constant up to a certain point (the carrying capacity); past it, each extra train makes all the others move slower.
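For anyone who'd rather not fire up OpenTTD, here's the toy single-lane simulation mentioned above (a sketch in Python; every number in it is made up, as the point is the mechanism rather than realism):

```python
# Toy single-lane traffic model. All parameters are invented;
# the point is the chain reaction near carrying capacity, not realism.

def simulate(n_cars, road_len=1000.0, v_max=10.0, safe_gap=15.0, steps=300):
    """Return how many distinct cars were ever forced below full speed."""
    spacing = road_len / n_cars
    # pos[0] is the lead car; followers trail behind at equal spacing.
    pos = [-i * spacing for i in range(n_cars)]
    forced_slow = set()
    for t in range(steps):
        for i in range(n_cars):
            v = v_max
            if i == 0 and 50 <= t < 60:
                v = 0.2 * v_max  # the lead car brakes for a while, e.g. to turn
            elif i > 0:
                gap = pos[i - 1] - pos[i]
                if gap < safe_gap:  # too close: slow down to keep a safe gap
                    v = v_max * gap / safe_gap
            if v < v_max:
                forced_slow.add(i)
            pos[i] += v  # advance one time step
    return len(forced_slow)

for n in (5, 20, 40, 80):
    print(f"{n:>2} cars -> {simulate(n):>3} forced to slow down")
```

With few cars, only the braking car slows down; crank the density up and that single slowdown forces car after car behind it to do the same, just like the trains past the carrying capacity.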
