A static website and Immich
So it’s supposed to be 15 hours/month included with your premium subscription? Since I’m not familiar with how Spotify audiobooks work, I thought you meant that you had a free account and were allowed to listen to 15 hours of books that would be included/unlimited with a premium subscription. Contact support if it ate through your monthly credits faster than it should. If you’re a paying customer, support is usually quite helpful.
I’ve been using Intel NUCs, even though they have a lot of issues and start failing after about 3 years of heavy use. Previously used Kodi on Arch, but with the latest NUC I decided to go with Xubuntu and for some reason video playback doesn’t work in Kodi now. So instead I just use VLC media player for TV/movies and a web browser for everything else. Got a Logitech K400 Plus wireless keyboard which makes it easy to control the computer from the couch.
Less than a year after that mail Swedish laws were rewritten to make copying music and movies illegal.
There are tons of options for running LLMs locally nowadays, though none come close to GPT4 or Claude 2 etc. One place to start is /c/[email protected]
We mostly use Discord since it’s difficult to convince people to sign up for new services… You have to do a workaround to stream desktop audio on Linux, since their client still only supports that on Windows, but other than that it usually works.
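For reference, the workaround I mean is the usual PulseAudio null-sink trick: route the apps you want to share into a virtual sink and point Discord’s microphone at that sink’s monitor. A rough sketch (the sink name is just a placeholder, and this assumes PulseAudio or PipeWire’s pactl shim):

```shell
#!/bin/sh
# Hedged sketch of the null-sink workaround for sharing desktop audio
# on Linux. "stream_sink" is a placeholder name.
setup_audio_loopback() {
    # Create a virtual output device; switch the apps whose audio you
    # want to share over to it (e.g. via pavucontrol).
    pactl load-module module-null-sink sink_name=stream_sink \
        sink_properties=device.description=StreamSink
    # Loop the sink's monitor back to the default output so you still
    # hear the audio yourself.
    pactl load-module module-loopback source=stream_sink.monitor
    # In Discord's voice settings, pick "Monitor of StreamSink" as the
    # microphone input.
}
```

The modules don’t persist across restarts, so people usually run this from a login script or add the lines to default.pa.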
Tried https://twoseven.xyz/ a few times during a period when Discord streaming was lagging a lot. It supports desktop streaming with a browser plugin, and sync watching on various streaming services. As far as I can remember it worked ok but had a few issues, though that was a while ago so those might’ve been fixed.
Also tried to get https://sfu.mirotalk.com/ working but for some reason video wouldn’t show up…
From what I can find, Plex downloads subs from opensubtitles.org, and they already exist there. I think the problem is that it treats “Star Wars” and “Star Wars Despecialized Edition” as the same movie.
I put all the subs in a zip file, in case anyone finds that easier than hunting them down individually on opensubtitles: https://www.swisstransfer.com/d/2ab10863-e9f9-442b-9d2c-44f0711f8280
Max validity was 30 days, so if someone has the possibility to host them more permanently, others might appreciate it in the future.
The only info I have about the actual video files is that Star Wars is supposedly Despecialized Edition v2.5, while ESB and RotJ only come with a text file crediting Harmy. Perhaps the latter two are also v2.5, but I have no note of it.
It seems like the file is too large for the clipboard, so it only copies the first ~20kB out of the total 100kB. I could probably find some workaround, but it seems like the despecialized subs are already available at opensubtitles.org, though perhaps a little bit hard to find. See my other comment.
I had a look and it already has subs for the despecialized editions, however they seem to be mixed in with the other versions of the movie, making them a little tricky to find: https://www.opensubtitles.org/en/search/sublanguageid-all/idmovie-458
Maybe there’s an easier method, but I filtered by language and then used the browser’s “find in page…” to locate the ones with “despecialized” tags
I have a folder named “Subtitles - Project Threepio” for the first movie, plus .srt files for the other two despecialized editions. If no one else does it first, I could upload them somewhere. Any good sites for sharing a few small files without having to register?
Static html+css page generated with this: https://github.com/maximtrp/tab
Do you mean that you want to build the docker image on one computer, export it to a different computer where it’s going to run, and there shouldn’t be any traces of the build process on the first computer? Perhaps it’s possible with the --output option… Otherwise you could write a small script which combines the commands for docker build, export to file, delete the local image, and clean up the system.
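Such a script could look something like this (the image and tarball names are just placeholders):

```shell
#!/bin/sh
# Hedged sketch: build an image, export it to a tarball for transfer,
# then remove the local copy and the cached build layers.
# "myimage" and "myimage.tar" are placeholder names.
build_export_clean() {
    docker build -t myimage . &&
    docker save myimage -o myimage.tar &&
    docker rmi myimage &&
    docker builder prune -f   # drop cached build layers
}
# On the target machine, import with: docker load -i myimage.tar
```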
https://github.com/miroslavpejic85/mirotalk might be an option. There’s both a server based version and a p2p version IIRC.
I asked someone about this a few days ago, and they claimed to have over 30000 photos in Nextcloud without issues
I suppose “a few” is quite open to interpretation, but I have 50k photos now so if it can handle 100k without getting sluggish it’ll probably be fine for the foreseeable future.
Does Nextcloud handle large numbers of photos nowadays? IIRC when I was comparing programs some years ago, I read that both it and Owncloud struggled once you got to a few tens of thousands of photos.
Ah, nice.
Btw. perhaps you’d like to add:
build: .
to docker-compose.yml so you can just write “docker-compose build” instead of having to do it with a separate docker command. I would submit a PR for it but I have made a bunch of other changes to that file so it’s probably faster if you do it.
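To be concrete, something like this, where the service name is just a placeholder from my own setup:

```yaml
services:
  app:                # placeholder service name
    build: .          # build from the Dockerfile in this directory
    image: app:latest # tag the built image
```

With that in place, “docker-compose build” (or “docker-compose up --build”) rebuilds the image without a separate docker build step.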
Awesome work! Going to try out koboldcpp right away. Currently running llama.cpp in docker on my workstation because it would be such a mess to get cuda toolkit installed natively…
Out of curiosity, isn’t conda a bit redundant in docker since it already is an isolated environment?
Hooray, I can finally play it. Had it on my wish list for years, and when I finally bought it I found out that neither the native Linux version nor the Windows+Proton version worked.