danielquinn

joined 1 year ago
[–] [email protected] 14 points 5 days ago

I just went through F-Droid and counted out all the projects I have on my phone. At £5 each, I'm looking at an annual bill of about £70... Bargain.

Thanks for the idea!

[–] [email protected] 1 points 1 week ago

Monolith has the same problem here. I think the best fix might be some sort of browser-plugin-based solution where you could say "archive this" and have it push the result somewhere.

I wonder if I could combine a dumb plugin with Monolith to do that... A weekend project perhaps.

[–] [email protected] 1 points 1 week ago

Monolith can be particularly handy for this. I used it in a recent project to archive the outgoing links from my own site. Coincidentally, if anyone is interested in that, it's called django-cool-urls.
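
For anyone who hasn't used it, the basic invocation is just this (the URL and output name are made up):

$ monolith 'https://example.com/some-article' -o some-article.html

That pulls the CSS, images, and scripts into a single self-contained HTML file.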

[–] [email protected] 6 points 1 week ago (1 children)

More of this please!

[–] [email protected] 5 points 1 week ago

exFAT is good for portable devices, but if the drive is staying inside one machine, there's no reason not to use ext4 or NTFS.

[–] [email protected] 3 points 1 week ago

That's not been my experience. Lots of drives I've bought have been FAT32 out of the box.

[–] [email protected] 3 points 1 week ago* (last edited 1 week ago) (4 children)
  • Keep everything in an external git service. You can use third party services like Codeberg, GitLab, or GitHub, or host your own on your NAS.
  • When you're not working on a project and don't think you'll need to reference it for a while, just delete it from your laptop. The code always lives in git anyway.

In terms of local storage, I usually keep everything in ~/projects/project-name, and I don't have tiny file size limits because I don't use FAT32 filesystems. FAT32 is the default filesystem on most USB sticks and external hard drives you buy; you have to reformat those drives to something like ext4 (Linux) or NTFS (Windows), or you're stuck with FAT32's 4GB limit on individual files.
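
Reformatting is a one-liner anyway. A rough sketch, assuming the stick shows up as /dev/sdX1 (check with lsblk first; this wipes that partition):

$ lsblk
$ sudo mkfs.ext4 -L my-stick /dev/sdX1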

[–] [email protected] 12 points 1 week ago (4 children)

You probably want to look into health checks. I believe you can tell Docker to "start service B only when service A is healthy", so you could define a health check that only passes once Tailscale is actually up.
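
Something like this in a compose file, roughly; the health-check command is a guess (whether "tailscale status" is the right test will depend on your container setup):

services:
  tailscale:
    image: tailscale/tailscale
    healthcheck:
      test: ["CMD", "tailscale", "status"]
      interval: 10s
      retries: 12
  my-service:
    image: my-service:latest    # placeholder for whatever depends on the VPN
    depends_on:
      tailscale:
        condition: service_healthy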

[–] [email protected] 2 points 2 weeks ago (1 children)

Well, I just tried it again, and while it won't let me take a screenshot of the lock screen, it's definitely still the case for me. It just sits there ringing with the pattern lock on the screen and a little "Return to call" button at the bottom.

[–] [email protected] 3 points 2 weeks ago (4 children)

Could be. I do remember trying to get it to work a number of ways at the time. If you're telling me that this isn't the case for you though, I might try it out again.

[–] [email protected] 13 points 2 weeks ago (6 children)

I used this for a while, but every time my phone rang I had to type in my pin to answer it which was a deal breaker for me.

[–] [email protected] 4 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

So my first impression is that the requirement to copy-paste that elaborate SQL to get the schema is clever but not especially intuitive. Rather than saying "Run this query and paste the output", you say "Run this script in your database" and print out a bunch of text that isn't a query at all but a one-liner Bash script that relies on the existence of pbcopy -- something that (a) doesn't exist on many default installs, (b) is a red flag for something that's meant to be self-hosted (why am I talking to a pasteboard?), and (c) is totally unnecessary anyway.

Instead, you could just say: "Run this query and paste the result in this box" and print out the raw SQL only. Leave it up to the user to figure out how they want to run it.

Alternatively, do something like "Run this on your machine and copy/paste the output":

$ curl 'https://app.chartdb.io/superquery.sql' | psql --username USERNAME --host HOSTNAME DBNAME

In the case of the cloud service, it's also not clear whether the data is stored server-side or client-side in localStorage. I would think the latter would be preferable.

 

From time to time, often after I've resumed from sleep or finished playing a Steam game, one of my CPU cores gets pinned at 100% with no indication of what might be doing it. Running htop, btop, or GNOME System Monitor all shows the same thing: CPU0 at 100% while the rest are doing next to nothing, and no particular process appears to be using those resources.

If I restart, it's back to normal, and sometimes I can play a game in Steam or let the computer go to sleep and it doesn't do this, but it happens often enough that it's annoying/confusing, so I'd like to know if there's a way to either (a) diagnose which processes are using which CPU cores, or (b) somehow "reset" the reporting of these values to make sure that something isn't just being misreported.

This is a desktop system running Arch & GNOME.
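
To be clear about what I'm after with (a): something like the psr column from ps, which shows the CPU each process last ran on, but with the mystery load actually accounted for:

$ ps -eo pid,psr,pcpu,comm --sort=-pcpu | head

In my case, nothing that comes out of that gets anywhere near explaining a whole core at 100%.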

 

It would seem that I have far too much time on my hands. After the post about a Star Trek "test", I started wondering if there could be any data to back it up and... well here we go:

Those Old Scientists

Name Total Lines Percentage of Lines
KIRK 8257 32.89
SPOCK 3985 15.87
MCCOY 2334 9.3
SCOTT 912 3.63
SULU 634 2.53
UHURA 575 2.29
CHEKOV 417 1.66

The Next Generation

Name Total Lines Percentage of Lines
PICARD 11175 20.16
RIKER 6453 11.64
DATA 5599 10.1
LAFORGE 3843 6.93
WORF 3402 6.14
TROI 2992 5.4
CRUSHER 2833 5.11
WESLEY 1285 2.32

Deep Space Nine

Name Total Lines Percentage of Lines
SISKO 8073 13.0
KIRA 5112 8.23
BASHIR 4836 7.79
O'BRIEN 4540 7.31
ODO 4509 7.26
QUARK 4331 6.98
DAX 3559 5.73
WORF 1976 3.18
JAKE 1434 2.31
GARAK 1420 2.29
NOG 1247 2.01
ROM 1172 1.89
DUKAT 1091 1.76
EZRI 953 1.53

Voyager

Name Total Lines Percentage of Lines
JANEWAY 10238 17.7
CHAKOTAY 5066 8.76
EMH 4823 8.34
PARIS 4416 7.63
TUVOK 3993 6.9
KIM 3801 6.57
TORRES 3733 6.45
SEVEN 3527 6.1
NEELIX 2887 4.99
KES 1189 2.06

Enterprise

Name Total Lines Percentage of Lines
ARCHER 6959 24.52
T'POL 3715 13.09
TUCKER 3610 12.72
REED 2083 7.34
PHLOX 1621 5.71
HOSHI 1313 4.63
TRAVIS 1087 3.83
SHRAN 358 1.26

Discovery

Important Note: As the source material is incomplete for Discovery, the following table only includes line counts from seasons 1 and 4 along with a single episode of season 2.

Name Total Lines Percentage of Lines
BURNHAM 2162 22.92
SARU 773 8.2
BOOK 586 6.21
STAMETS 513 5.44
TILLY 488 5.17
LORCA 471 4.99
TARKA 313 3.32
TYLER 300 3.18
GEORGIOU 279 2.96
CULBER 267 2.83
RILLAK 205 2.17
DETMER 186 1.97
OWOSEKUN 169 1.79
ADIRA 154 1.63
COMPUTER 152 1.61
ZORA 151 1.6
VANCE 101 1.07
CORNWELL 101 1.07
SAREK 100 1.06
T'RINA 96 1.02

If anyone is interested, here's the (rather hurried, don't judge me) Python used:

#!/usr/bin/env python

#
# This script assumes that you've already downloaded all the episode lines from
# the fantastic chakoteya.net:
#
# wget --accept=html,htm --relative --wait=2 --include-directories=/STDisco17/ http://www.chakoteya.net/STDisco17/episodes.html -m
# wget --accept=html,htm --relative --wait=2 --include-directories=/Enterprise/ http://www.chakoteya.net/Enterprise/episodes.htm -m
# wget --accept=html,htm --relative --wait=2 --include-directories=/Voyager/ http://www.chakoteya.net/Voyager/episode_listing.htm -m
# wget --accept=html,htm --relative --wait=2 --include-directories=/DS9/ http://www.chakoteya.net/DS9/episodes.htm -m
# wget --accept=html,htm --relative --wait=2 --include-directories=/NextGen/ http://www.chakoteya.net/NextGen/episodes.htm -m
# wget --accept=html,htm --relative --wait=2 --include-directories=/StarTrek/ http://www.chakoteya.net/StarTrek/episodes.htm -m
#
# Then you'll probably have to convert the following files to UTF-8 as they
# differ from the rest:
#
# * Voyager/709.htm
# * Voyager/515.htm
# * Voyager/416.htm
# * Enterprise/41.htm
#

import re
from collections import defaultdict
from pathlib import Path

# Per-episode transcript files are named like "102.htm" / "102.html"
EPISODE_REGEX = re.compile(r"^\d+\.html?$")
# A spoken line looks like "PICARD: Engage." (apostrophes allowed for O'BRIEN, T'POL)
LINE_REGEX = re.compile(r"^(?P<name>[A-Z']+): ")

EPISODES = Path("www.chakoteya.net")
DISCO = EPISODES / "STDisco17"
ENT = EPISODES / "Enterprise"
TNG = EPISODES / "NextGen"
TOS = EPISODES / "StarTrek"
DS9 = EPISODES / "DS9"
VOY = EPISODES / "Voyager"

NAMES = {
    TOS.name: "Those Old Scientists",
    TNG.name: "The Next Generation",
    DS9.name: "Deep Space Nine",
    VOY.name: "Voyager",
    ENT.name: "Enterprise",
    DISCO.name: "Discovery",
}


class CharacterLines:
    def __init__(self, path: Path) -> None:
        self.path = path
        self.line_count = defaultdict(int)

    def collect(self) -> None:
        for episode in self.path.glob("*.htm*"):
            if EPISODE_REGEX.match(episode.name):
                for line in episode.read_text().split("\n"):
                    if m := LINE_REGEX.match(line):
                        self.line_count[m.group("name")] += 1

    @property
    def as_tabular_data(self) -> tuple[tuple[str, int, float], ...]:
        total = sum(self.line_count.values())
        r = []
        for k, v in self.line_count.items():
            percentage = round(v * 100 / total, 2)
            if percentage > 1:
                r.append((str(k), v, percentage))
        return tuple(reversed(sorted(r, key=lambda _: _[2])))

    def render(self) -> None:
        print(f"\n\n# {NAMES[self.path.name]}\n")
        print("| Name             | Total Lines | Percentage of Lines |")
        print("| ---------------- | :---------: | ------------------: |")
        for character, total, pct in self.as_tabular_data:
            print(f"| {character:16} | {total:11} | {pct:19} |")


if __name__ == "__main__":
    for series in (TOS, TNG, DS9, VOY, ENT, DISCO):
        counter = CharacterLines(series)
        counter.collect()
        counter.render()
 

My father is 75 and not very capable on a computer. He's got an old MacBook Air at home, behind a typical ISP router that he has no admin access to (so no port forwarding).

My immediate need is actually not his machine at all, but the Raspberry Pi I installed at his house before I left the country; I forgot to enable cron on it, so it's not doing what I need yet. However, it would be really nice if I could also do one of the following:

  • VNC (or something) into his computer whenever something "isn't working" rather than doing the talk-him-through-it dance over Skype.
  • Install a new OS (the Mac is no longer supported by macOS). I don't know how plausible this is, though.

My current plan is to email him a shell script that should create a reverse SSH tunnel to a server in Montréal or something, and then I can shell into his Mac through there. It's not ideal though, since we're still talking shell scripts and he's easily frustrated.
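
Roughly this, where the server name, ports, and usernames are placeholders, and it assumes Remote Login (sshd) is switched on on his Mac:

# What the emailed script would run on his Mac
$ ssh -N -R 2222:localhost:22 tunnel@vps-in-montreal.example.com

# Then, from the VPS, I can get at his machine
$ ssh -p 2222 dad@localhost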

I know that in Windows land there are all sorts of tools scammers use to take over a machine remotely. Does Mac allow for the same thing? Note that I only have Linux machines available to me on this side of the Atlantic.

 

I'm working on some materials for a class wherein I'll be teaching some young, wide-eyed Windows nerds about Linux, and we're including a section we're calling "foot guns". Basically, it's ways you might shoot yourself in the foot while meddling with your newfound Linux powers.

I've got the usual one of forgetting the . in lines like this (turning it into rm -rf /bin):

$ rm -rf ./bin

As well as a bunch of other fun stories like that one time I mounted my Linux home folder into my Windows machine, forgot I did that, then deleted a parent folder.

You know, the war stories.

Tell me yours. I wanna share your mistakes so that they can learn from them.

Fun (?) side note: somehow, my entire ${HOME}/projects folder has been deleted like... just now, and I have no idea how it happened. I may have a terrible new story to add if I figure it out.

 

His original post, titled I can't sleep, is some brilliant writing. When we talk about the chilling effect that criticism of Israel creates in industries everywhere (including ours), this is what that looks like.

 

[For reference, I'm talking about Ash in Alpine Linux here, which is part of BusyBox.]

I thought I knew the big differences, but it turns out I've had false assumptions for years. Ash does support [[ double square brackets ]] and (as best I can tell) all of Bash's logical trickery inside them. It also supports ${VARIABLE_SUBSTRINGS:5:12}-style substring expansion, which was another surprise.

At this stage, the only things I've found that Bash can do that Ash can't are:

  • Arrays, which Bash doesn't seem to do well anyway.
  • Brace expansion, which is awesome but I can live without it (quick example of both below).
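
For anyone unfamiliar with those two, this is the sort of thing that works in Bash but (apparently) not in Ash; the file names are made up:

# Arrays
$ files=(one.txt two.txt three.txt)
$ echo "${files[1]}"
two.txt

# Brace expansion: copies config.yml to config.yml.bak
$ cp config.{yml,yml.bak}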

What else is there? Did Ash use to be more limited? The double square bracket thing really surprised me.

 

The other day someone was complaining about the new ad blocker-blocker on YouTube and I mentioned that it might be fun to write a Firefox extension that would just load up yt-dlp and play the video through mpv.
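
For the curious, the heavy lifting isn't even in the extension: with yt-dlp on your $PATH, mpv will already play a YouTube URL directly (the URL here is a placeholder):

$ mpv 'https://www.youtube.com/watch?v=VIDEO_ID'

The extension really just has to get that command run with the current tab's URL.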

It turns out, writing a Firefox extension is easy and tricking Firefox into launching yt-dlp isn't much harder (though it does require some annoying configuration on the user's end).

Anyway, if you're a Linux user, feel free to try it out. I don't know how much I'm going to pour into this, but as an exercise of "can this be done", it was pretty good for a few hours on a Friday night.
