[-] [email protected] 1 points 1 day ago

I didn't like the capitalised names so configured xdg to use all lowercase letters. That's why ~/opt fits in pretty nicely.

You've got a point re ~/.local/opt but I personally like the idea of having the important bits right in my home dir. Here's my layout (which I'm quite used to now after all these years):

$ ls ~
bin  
desktop  
doc  
downloads  
mnt  
music  
opt 
pictures  
public  
src  
templates  
tmp  
videos  
workspace

where

  • bin is just a bunch of symlinks to frequently used apps from opt
  • src is where I keep clones of repos (but I don't do work in src)
  • workspace is where I do my work on git worktrees (based off src)
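For context, populating ~/bin is just a matter of symlinking into ~/opt - something along these lines (hello-2.12 is a made-up example path, not part of my actual setup):

```shell
# expose an app installed under ~/opt via ~/bin
# (hello-2.12 is a hypothetical example package)
mkdir -p ~/bin ~/opt/hello-2.12/bin
ln -sf ~/opt/hello-2.12/bin/hello ~/bin/hello
```

Then ~/bin just needs to be on PATH once and new apps are a single `ln -sf` away.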
[-] [email protected] 14 points 1 day ago

Thanks! So much for my reading skills/attention span πŸ˜‚

[-] [email protected] 0 points 1 day ago

Which Debian version is it based on?

[-] [email protected] 9 points 1 day ago

Something that I'll definitely keep an eye on. Thanks for sharing!

[-] [email protected] 3 points 2 days ago

RE Go: Others have already mentioned the right way, though I'd personally prefer ~/opt/go over what was suggested.
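Concretely, that'd be something like the following in your bash init files (assuming ~/opt/go as the target; `go install` will then drop binaries under ~/opt/go/bin):

```shell
# tell the Go toolchain to use ~/opt/go instead of the default ~/go
export GOPATH="$HOME/opt/go"
export PATH="$GOPATH/bin${PATH:+:${PATH}}"
```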


RE Perl: To instruct Perl to install to another directory, for example to ~/opt/perl5, put the following lines somewhere in your bash init files.

export PERL5LIB="$HOME/opt/perl5/lib/perl5${PERL5LIB:+:${PERL5LIB}}"
export PERL_LOCAL_LIB_ROOT="$HOME/opt/perl5${PERL_LOCAL_LIB_ROOT:+:${PERL_LOCAL_LIB_ROOT}}"
export PERL_MB_OPT="--install_base \"$HOME/opt/perl5\""
export PERL_MM_OPT="INSTALL_BASE=$HOME/opt/perl5"
export PATH="$HOME/opt/perl5/bin${PATH:+:${PATH}}"

Though you need to re-install the Perl packages you had previously installed.

[-] [email protected] 4 points 3 days ago

This is fantastic! πŸ‘

I use Perl one-liners for record and text processing a lot and this will be definitely something I will keep coming back to - I've already learned a trick from "Context Matching" (9) πŸ™‚

[-] [email protected] 1 points 4 days ago

That sounds like a great starting point!

πŸ—£Thinking out loud here...

Say, if a crate implements the AutomatedContentFlagger interface it would show up on the admin page as an "Automated Filter" and the admin could dis/enable it on demand. That way we can have more filters than CSAM using the same interface.
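Something along these lines is what I have in mind - a rough sketch only: the trait name is from this thread, but the methods and types are pure speculation on my part, not Lemmy's actual API.

```rust
// Hypothetical sketch - not Lemmy's real API.
pub struct FlagResult {
    pub flagged: bool,
    pub reason: Option<String>,
}

pub trait AutomatedContentFlagger {
    /// Human-readable name shown on the admin page, eg "CSAM filter".
    fn name(&self) -> &str;
    /// Inspect a piece of content and decide whether to flag it.
    fn check(&self, content: &str) -> FlagResult;
}

// A trivial keyword-based filter, just to show a second implementor
// of the same interface.
pub struct KeywordFlagger {
    pub keywords: Vec<String>,
}

impl AutomatedContentFlagger for KeywordFlagger {
    fn name(&self) -> &str {
        "Keyword filter"
    }

    fn check(&self, content: &str) -> FlagResult {
        let hit = self.keywords.iter().find(|k| content.contains(k.as_str()));
        FlagResult {
            flagged: hit.is_some(),
            reason: hit.map(|k| format!("matched keyword: {k}")),
        }
    }
}
```

The admin page would then just iterate over the registered `dyn AutomatedContentFlagger`s and render a toggle per `name()`.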

5
submitted 1 week ago* (last edited 1 week ago) by [email protected] to c/[email protected]

cross-posted from: https://lemmy.ml/post/5193714

A few days ago, DHH (from 37signals) wrote about how they moved off the cloud and how that has helped reduce their costs by a good measure.

Well, earlier today, he announced the first bit of tooling that they used as part of their cloud exit move: Kamal - which is already at version 1.0 and, according to DHH, stable.


I took a quick look at the documentation and it looks to me like an augmented and feature-rich Docker Compose which is, to no surprise, rather opinionated.

I think anyone who's had experience with the simplicity of Docker Swarm compared to K8s would appreciate Kamal's way. Hopefully it will turn out to be more reliable than Swarm though.

I found it quite a pragmatic approach to containerising an application suite with the aim of covering a good portion of the use-cases and requirements of smaller teams.


PS: I may actually try it out in an ongoing personal project instead of Compose or K8s. If I do, I'll make sure to keep this post, well, posted.

[-] [email protected] 10 points 1 week ago

That was my case until I discovered that GNU tar has got a pretty decent online manual - it's way better written than the manpage. I rarely forget the options nowadays even though I don't use tar that frequently.

[-] [email protected] 22 points 1 week ago

This is quite intriguing. But DHH has left so many details out (at least in that post) as pointed out by @[email protected] - it makes it difficult to relate to.

On the other hand, like DHH said, one's mileage may vary: it's, in many ways, a case-by-case analysis that companies should do.

I know many businesses shrink the ops team and hire less experienced ops people to save $$$ - only to forward those saved $$$ to cloud providers. I can only assume DHH's team is comprised of a bunch of experienced, well-paid ops people who can pull such feats off.

Nonetheless, looking forward to, hopefully, a follow up post that lays out some more details. Pray share if you come across it πŸ™

19
submitted 2 weeks ago by [email protected] to c/[email protected]

cross-posted from: https://lemmy.ml/post/4908824

There are two major flavours of variables in GNU Make: "simple" and "recursive".

While simple variables are quite simple and easy to understand, they can be limiting at times. On the other hand, recursive variables are powerful yet tricky.

...

There is exactly one rule to recall when using recursive variables...

🧠 The value of a recursive variable is computed every time it is expanded.
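A tiny (made-up) Makefile shows the two flavours side by side:

```make
X = 1
SIMPLE := $(X)      # expanded right here: SIMPLE is "1" for good
RECURSIVE = $(X)    # stored verbatim: $(X) is expanded on every use
X = 2

demo:
	@echo simple=$(SIMPLE) recursive=$(RECURSIVE)
```

Running `make demo` prints `simple=1 recursive=2` - the recursive variable picked up the later assignment to X because it was only expanded when the recipe ran.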

[-] [email protected] 9 points 2 weeks ago

Interesting topic - I've seen it surface a few times recently.

I've never been a mod anywhere so I can't accurately picture what workflows/tools a mod needs to be satisfied w/ their, well, mod'ing.

For the sake of my education at least, can you elaborate on what you consider decent moderation tools/workflows? What gaps do you see between that and Lemmy?

PS: I genuinely want to understand this topic better but your post doesn't provide any details. πŸ˜…

[-] [email protected] 7 points 2 weeks ago* (last edited 2 weeks ago)

That single line of Lisp is probably (defmacro generate-compiler (...) ...) which GCC folks call every time they decide to implement a new compiler πŸ˜†

162
submitted 3 weeks ago by [email protected] to c/[email protected]

From GNU lists earlier today:

We have learned with deep sadness that Thien-Thi Nguyen (ttn) died in October 2022. Thien-Thi was a hacker, artist, writer, and long-time maintainer and contributor to many GNU programs as well as other free software packages. He was the GNU maintainer of the rcs, guile-sdl, alive, and superopt packages, and he was working on GNU Go as well.

Thien-Thi especially loved GNU Emacs, GNU Taler, and GNU Go: he was the author and maintainer of the xpm, gnugo, ascii-art-to-unicode, and hideshow GNU Emacs packages and made substantial contributions to many others such as vc, as well as to GNU Taler and its documentation.

We greatly miss Thien-Thi in the free software community - his death is a great loss to the Free World.

33
submitted 3 weeks ago by [email protected] to c/[email protected]

cross-posted from: https://lemmy.ml/post/4560181

A follow up on [DISCUSS] Website to monitor Lemmy servers' performance/availability


I wanted to experiment w/ Lemmy's APIs to, eventually, build a public-facing performance monitoring solution for Lemmy.

It started w/ a couple of shell commands which I found myself repeating. Then I recalled the saying "Don't repeat yourself - make Make make things happen for you!" and, well, stopped typing commands in bash.

Instead I, incrementally, wrote a makefile to do the crud work for me (esp thanks to its declarative style): https://github.com/bahmanm/lemmy-clerk/blob/v0.0.1/run-clerk


TBH there's nothing special about the file. But I thought I'd share this primarily b/c it is a demonstration of the patterns I usually use in my makefiles and I'd love some feedback on those.

Additionally, it's a real world use-case for bmakelib (a library that I maintain 😎 )

57
submitted 3 weeks ago* (last edited 3 weeks ago) by [email protected] to c/[email protected]

I am not the author.

https://github.com/galdor/github-license-observer

https://addons.mozilla.org/en-GB/android/addon/github-license-observer/

This is a cool little addon to help you tell, at a glance, if the repository you're browsing on github has an open source license.

Especially relevant nowadays given the trend to convert previously OS repos to non-OS licenses as a business model (eg Akka or Terraform.)

1
submitted 1 month ago* (last edited 1 month ago) by [email protected] to c/[email protected]

Got a notification from LinkedIn saying "You're one of the few experts who have been invited to collaborate on ..." I got curious and opened up the link.


Apparently, now instead of professional writers being paid to pen down their, usually, cohesive & authentic views, LinkedIn is trying out the idea of generating content using an LLM and then asking for free editorial services from users in exchange for "badges" 🀯 πŸ€¦β€β™‚οΈ

This is cheap IMO. Even for LinkedIn.

What's happened to the "content team" at LinkedIn!?

102
submitted 1 month ago* (last edited 1 month ago) by [email protected] to c/[email protected]

I thought I'd share how happy I've been w/ my Gnome experience these past few years despite the occasionally controversial UI/UX decisions the Gnome folks tend to make.

I use Gnome Online Accounts integration w/ Google (drive, e-mail, calendar & contacts) and it "just works"β„’ & it does so quite reliably.

It's so polished & well-integrated in the desktop that I often don't even notice that I'm using it on a daily basis ❀️

PS: I'm using Gnome 44.3 on openSUSE Tumbleweed running on an old ThinkPad T530 w/ an nVidia GPU.

4
submitted 1 month ago by [email protected] to c/[email protected]

cross-posted from: https://lemmy.ml/post/4079840

"Don't repeat yourself. Make Make make things happen for you!" 😎

I just created a public room dedicated to all things about Make and Makefiles.

#.mk:matrix.org
or
matrix.to/#/#.mk:matrix.org

Hope to see you there.

1
submitted 1 month ago by [email protected] to c/[email protected]

TIL that I can use Perl's Benchmark module to time and compare the performance of different commands in an OS-agnostic way, ie as long as Perl is installed.

For example, to benchmark curl, wget and httpie you could simply run:

$ perl -MBenchmark=:all \
     -E '$cmd_wget    = sub { system("wget  https://google.com > /dev/null 2>&1") };' \
     -E '$cmd_curl    = sub { system("curl  https://google.com > /dev/null 2>&1") };' \
     -E '$cmd_httpie  = sub { system("https https://google.com > /dev/null 2>&1") };' \
     -E '$timeresult  = timethese(15, { "wget" => $cmd_wget, "curl" => $cmd_curl, "httpie" => $cmd_httpie });' \
     -E 'cmpthese($timeresult)'

which on my old T530 produces:

Benchmark: timing 15 iterations of curl, httpie, wget...

      curl:  2 wallclock secs ( 0.00 usr  0.00 sys +  0.42 cusr  0.11 csys =  0.53 CPU) @ 28.30/s (n=15)
    httpie:  8 wallclock secs ( 0.00 usr  0.01 sys +  4.63 cusr  0.79 csys =  5.43 CPU) @  2.76/s (n=15)
      wget:  3 wallclock secs ( 0.00 usr  0.00 sys +  0.53 cusr  0.19 csys =  0.72 CPU) @ 20.83/s (n=15)
    
         Rate httpie   wget   curl
httpie 2.76/s     --   -87%   -90%
wget   20.8/s   654%     --   -26%
curl   28.3/s   925%    36%     --

Very handy indeed ❀

2
submitted 1 month ago* (last edited 1 month ago) by [email protected] to c/[email protected]

When you open a new tab, you can instantly start typing and press ENTER which sends your query to the search engine.

However once that's done, there's no easy way to edit the query directly from the URL bar. The URL bar will contain, well, the URL and not the original query anymore.

Is there a way to edit the search query w/o using the search engine's web page or retyping the whole query again? In other words, is there a way to tell Firefox to show me the previous query in the URL bar instead of showing the URL?

I'd like to try to send as many queries as possible to Google directly from Firefox rather than using Google's webpage (more $$$ for Firefox.)

An example where I searched for Lemmy and tried to edit the search query

1
submitted 1 month ago* (last edited 1 month ago) by [email protected] to c/[email protected]

cross-posted from: https://lemmy.ml/post/3413371

I've had the (mis)fortune to deal w/ a good number of Makefiles over the years. Enough to take a liking to Gnu Make πŸ€·β€β™‚οΈ

I've been haphazardly compiling a collection of common tasks that I'd like my Makefiles to do out-of-the-box for me.

The collection grew to a point where I thought it might benefit my fellow engineers in their day-to-day programming.

Hence bmakelib was born w/ an Apache v2.0 license πŸ‘Ά

It is essentially a collection of useful targets, recipes and variables you can use to augment your Makefiles.

The aim is not to simplify writing Makefiles but rather help you write cleaner and easier to read and maintain ones.

Everything is well tested via a CI pipeline (yes, I wrote unit tests for Makefiles 😎) and should work out of the box.

Please take a look and let me know what you think. I'd love to hear your thoughts and possibly your experience if you ever use bmakelib in your projects.

1
submitted 1 month ago by [email protected] to c/[email protected]

cross-posted from: https://lemmy.ml/post/3229278

Suppose I've got a simple #Makefile w/ a few URLs that I'd like to process as dynamic targets.

For example here is a not working snippet:

.DEFAULT_GOAL := all

#####
URLS  = https://foo.example.com
URLS += https://bar.example.com
URLS += https://www.example.org

#####
% :
	@echo the url is $(*)

#####
.PHONY : all
all : $(URLS)

It fails w/

*** target pattern contains no '%'

I believe that's b/c of the character : being part of URLS which confuses Make after expansion.

As a workaround, I've removed https:// from all URLs. For example this works:

URLS = foo.example.com
URLS += bar.example.com

I know Make generally doesn't play well w/ targets w/ space or colon in the name but I wonder if the above is the best I can do. What do you think?
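One workaround that keeps the full URLs in the variable is to derive colon-free target names w/ $(subst ...) and re-attach the scheme in the recipe - a sketch, assuming all the URLs share the https:// prefix:

```make
URLS := https://foo.example.com https://bar.example.com https://www.example.org

# colon-free names which are safe to use as targets
TARGETS := $(subst https://,,$(URLS))

.PHONY : all $(TARGETS)
all : $(TARGETS)

$(TARGETS) :
	@echo the url is https://$(@)
```

Same idea as stripping the scheme by hand, just done once in the Makefile instead of in the data.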


bahmanm

389 post score
162 comment score
joined 3 months ago