That'll do it. According to the docs, $1 is the GID (assuming this is some kind of aria2-specific thing and not the UNIX gid) and $2 is the number of files (1 for HTTP), if anyone is interested. Thanks! :)
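For anyone who wants something concrete, a minimal sketch of such an event hook might look like the following (the script name and log path are placeholders; my reading of the same docs is that a third argument, the file path, is passed as well):

```sh
#!/bin/sh
# hook.sh -- rough sketch of an aria2 --on-download-complete hook
# $1 = GID, $2 = number of files, $3 = path of the (first) file
gid="$1"
nfiles="$2"
path="$3"
printf 'finished %s (%s file(s)): %s\n' "$gid" "$nfiles" "$path" >> /tmp/aria2-complete.log
```

Invoked as `aria2c --on-download-complete=./hook.sh -i urls.txt`, assuming a `urls.txt` input file.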
@JeffSchaller Oh, that's true. I bet with some pipes you could even communicate wget's PID to the monitoring process so it watches the right one (see the sketch below). This removes the ability to have a dependency between wget and the monitoring process, though. For example, you can't do wget -i $(./monitoring-process) and have it emit new files to download. I'm probably pushing against the limits of what I should be doing here (beyond this point I should just throw everything into a script). I'm trying to lean on wget as much as possible, because it does its job well!
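Roughly what I had in mind, assuming a hypothetical ./monitor script that takes wget's PID and watches the files it has open:

```sh
# rough sketch: run wget in the background and hand its PID to a
# (hypothetical) monitoring script, then wait for wget to finish
wget -i urls.txt &
wget_pid=$!
./monitor "$wget_pid" &
wait "$wget_pid"
```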
@Httqm That could work if there were a way to distinguish between separate files. Piping loses the file names and makes it difficult (impossible?) to separate the files. Note that I'm using wget -i to download multiple URLs here.
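To illustrate what I mean by losing the file names: with -O - everything is concatenated into a single stream, so the downstream process (./process here is hypothetical) can't tell where one file ends and the next begins:

```sh
wget -q -O - -i urls.txt | ./process   # one undifferentiated byte stream
```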
@JeffSchaller Yeah, that's a decent solution. However, the last file not having a newer file after it makes it hard to come up with a reasonable termination condition (a timeout isn't great since it depends on file size and connection speed; running fuser on just the last file is less fragile but still bad). I guess I could add a sentinel dummy file URL, but it feels like there should be a better way!
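For reference, the fuser-on-just-the-last-file variant would look roughly like this (the file name is a placeholder for whatever the last entry in urls.txt is; fuser -s just reports status silently via its exit code):

```sh
last=final-file.iso                       # hypothetical: last entry in urls.txt
until [ -e "$last" ]; do sleep 1; done    # wait for wget to create it
while fuser -s "$last"; do sleep 1; done  # wait until nothing has it open
echo "all downloads finished"
```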
It seems like your client is trying to load a bunch of keys and then failing. What does your ~/.ssh/config look like? You can either create a key pair (ssh-keygen) and add the public key to ~/.ssh/authorized_keys on the server, or you can tweak your config to allow password login.
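If you go the key route, it's roughly the following (the key type and the remote user/host are placeholders):

```sh
# generate a key pair locally, then install the public key on the server
ssh-keygen -t ed25519 -f ~/.ssh/id_ed25519
ssh-copy-id -i ~/.ssh/id_ed25519.pub user@host
```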