Yeah, I've had a cifs share in my fstab before, mounted to a folder in my home dir. I took the PC off-site for a LAN party, and just trying to ls my home dir took forever, presumably because ls was stat-ing the dead mount point and blocking until the network timeout. Commenting the entry out and restarting fixed it all.
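If you want to keep the entry without the hang, something like this should work (untested sketch; the server, share, mount point, and credentials file are all placeholders): it tells systemd to mount the share lazily on first access instead of at boot, and to give up quickly if the network isn't there.

    //server/share  /home/me/nas  cifs  credentials=/home/me/.smbcreds,noauto,x-systemd.automount,x-systemd.mount-timeout=10s,_netdev  0  0

With noauto plus x-systemd.automount the share isn't touched at boot, and x-systemd.mount-timeout caps how long an access can block when the server is unreachable.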
Good luck with the new install!
Any input to the 2nd LLM is part of its prompt, so if it sees the user input at all, that input shifts the probabilities of the output.
There's no such thing as "training an AI to follow instructions". The output is just a probabilistic function of the input. This is why a jailbreak is always possible: the model samples from a softmax over its vocabulary, which assigns every token a nonzero probability, so the probability of getting it to output something that was given as input is never exactly 0.
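To make that concrete, here's a toy sketch (plain numpy, made-up logits, not any particular model): a softmax output is strictly positive in every position, so no token is ever truly ruled out.

    import numpy as np

    def softmax(logits):
        # Shift by the max for numerical stability; exp() is strictly positive.
        z = np.exp(logits - np.max(logits))
        return z / z.sum()

    # Made-up logits for a tiny 5-token vocabulary.
    logits = np.array([9.0, 2.5, -4.0, 0.1, -12.0])
    probs = softmax(logits)
    print(probs)        # every entry is > 0
    print(probs.min())  # tiny, but never exactly 0

In practice float underflow can round the smallest probabilities to zero, but the distribution itself never does, which is the point about jailbreaks.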