I currently have Plex and Radarr. I only ever grabbed 1080p content because that’s all my TV could do. I’m finally getting a nice new TV and would like to get a few 4K HDR movies to watch. Should I be looking for Bluray-2160, Remux, or full disc, and what will Plex play?
As a general rule of thumb, stay away from 4K HDR movies that are below ~15 GB. At that size the bitrate is low enough that it will look almost the same as 1080p content, so if the file is below 15 GB, don’t touch it.
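A quick way to sanity-check that rule of thumb is to turn file size into an average bitrate. A rough Python sketch (the file sizes and runtime are just illustrative numbers, not recommendations):

```python
# Back-of-the-envelope check: average bitrate = file size / runtime.

def avg_bitrate_mbps(size_gb: float, runtime_minutes: float) -> float:
    """Approximate average bitrate in Mbit/s for a given file size and runtime."""
    size_bits = size_gb * 8 * 1000**3        # GB -> bits (decimal units)
    runtime_seconds = runtime_minutes * 60
    return size_bits / runtime_seconds / 1_000_000

# A 15 GB file for a 2-hour movie averages ~17 Mbit/s (video + audio combined),
# while an 80 GB UHD remux of the same runtime averages ~90 Mbit/s.
print(f"{avg_bitrate_mbps(15, 120):.0f} Mbit/s")   # ~17
print(f"{avg_bitrate_mbps(80, 120):.0f} Mbit/s")   # ~89
```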
Have you seen the PSA x265 4K versions? I find them relatively good. What’s your opinion?
Plain 2160p is probably what you want; remuxes get very large for marginal benefit. You might have trouble if you’re using the built-in smart TV app for Plex, but the Plex app on a Chromecast will keep up fine.
As far as I remember, a remux is the only way to get true 10-bit HDR…
When compressed, HDR is either stripped out or doesn’t really work anymore - at least that was the consensus a few years ago.
There are quite a lot of h265 HDR rips available now, particularly for newer series released on Netflix etc. They definitely support full 10bit HDR and look good to my eyes.
Seconded; the idea that you need a remux to take advantage of modern audio/visual tech was never really true.
Bluray2160 and use mediainfo to make sure the codecs being used are AVC and AAC in an MP4 container.
If they have a 4K HDR-capable TV they can play back HEVC. Does anyone even make x264 2160p releases? And the container doesn’t matter either; Plex will remux it for you if necessary.
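Either way, mediainfo will tell you exactly what a release contains before you add it to the library. A minimal sketch using the pymediainfo Python wrapper (assuming it’s installed via pip; attribute names mirror MediaInfo fields and can vary between versions, and the filename is just a placeholder):

```python
from pymediainfo import MediaInfo

def inspect(path: str) -> None:
    """Print the video/audio codecs and HDR-related fields MediaInfo reports."""
    media_info = MediaInfo.parse(path)
    for track in media_info.tracks:
        if track.track_type == "Video":
            print("Video codec:", track.format)                      # e.g. HEVC or AVC
            print("Bit depth:  ", track.bit_depth)                   # 10 for 10-bit HDR
            print("HDR format: ", getattr(track, "hdr_format", None))  # field name may vary
        elif track.track_type == "Audio":
            print("Audio codec:", track.format)                      # e.g. AC-3, DTS, AAC

inspect("movie.2160p.mkv")  # hypothetical file name
```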
Would it make sense to set up a second Radarr instance just for 4K stuff? Or can I have Radarr download 4K to a specific folder?
It depends on what you want. If you plan on keeping two separate libraries in Plex, I would also keep two Radarr instances, but if you’re going for a combined library you could stick to one and just use a different quality profile for the movies you want in 4K.
Yeah, I run two instances of Sonarr and Radarr. It makes updating a bit fucky but works pretty well.
I do that since I like to keep both libraries separate in Plex and have duplicates.
1080p for remote streaming and low-end devices, 4K for local streaming and approved remote clients only. That minimises how often a 4K file gets force-transcoded down to 1080p when a better-looking native 1080p file would have been available.
A second Radarr instance lets me manage that separately, but I have the two synced to each other.
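For anyone who scripts their setup, here’s a rough sketch of how you might double-check that the two instances point at different root folders and quality profiles through Radarr’s v3 API. The hostnames, ports and API keys are placeholders, and the endpoints are the standard v3 ones as I remember them, so verify against your own Radarr version:

```python
import requests

# Placeholder URLs/keys for two hypothetical instances.
INSTANCES = {
    "radarr-1080p": ("http://localhost:7878", "API_KEY_1080"),
    "radarr-4k":    ("http://localhost:7879", "API_KEY_4K"),
}

for name, (url, api_key) in INSTANCES.items():
    headers = {"X-Api-Key": api_key}
    roots = requests.get(f"{url}/api/v3/rootfolder", headers=headers).json()
    profiles = requests.get(f"{url}/api/v3/qualityprofile", headers=headers).json()
    print(name)
    print("  root folders:", [r["path"] for r in roots])
    print("  profiles:    ", [p["name"] for p in profiles])
```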
AAC is such a stupid codec to use for anything other than 2.0-channel audio.
5.1/7.1 AAC doesn’t work over ARC. Over eARC it needs to be decoded to LPCM and then sent out, and it doesn’t work with SPDIF at all.
Over both eARC and SPDIF it will most likely end up downmixed to 2ch.
448 or 640 kbps AC3 is perfectly fine and has great compatibility: SPDIF, ARC, eARC. Pretty much all Blu-rays and DVDs will also have an AC3 track.
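If you end up with a multichannel AAC track you want to fix, re-encoding just the audio to AC3 is straightforward with ffmpeg. A sketch driven from Python (filenames are placeholders; check your stream layout with `ffmpeg -i input.mkv` first and adjust the -map options if needed):

```python
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "input.mkv",
    "-map", "0",          # keep all streams
    "-c:v", "copy",       # don't touch the video
    "-c:s", "copy",       # keep subtitles as-is
    "-c:a", "ac3",        # re-encode audio to AC3
    "-b:a", "640k",       # 640 kbps, the usual ceiling for AC3
    "output.mkv",
], check=True)
```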
It’s going to depend on the TV, particularly if you’re using the built-in app rather than a proper external player.
Generally a remux is going to be the best, but you might need to track down non-DV versions, or specifically MP4 versions, etc. on some TVs.
My hope is to either get an Apple TV or test out the built-in app on a Samsung.
OK, the built-in Sammy app is fine, but the files have to be in the right format, so you will hit a problem file occasionally.
I’d use the Apple TV 4K as it has better format handling and supports higher bitrates.
Make sure your local network is up to scratch too. It doesn’t matter how fast your internet is: if your TV is on WiFi it might not be enough to stream 4K smoothly from Plex, and Ethernet on most TVs is 100 Mbit, which can also be a problem for high-bitrate 4K.
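One way to find out whether the wire or the WiFi is actually the bottleneck is to measure raw throughput between the Plex box and the TV end of the network with iperf3. A sketch, assuming an iperf3 server (`iperf3 -s`) is running on the far machine; the IP is a placeholder and the JSON field names are as I remember them, so double-check against your iperf3 version:

```python
import json
import subprocess

SERVER = "192.168.1.50"  # placeholder: IP of the machine near the TV

result = subprocess.run(
    ["iperf3", "-c", SERVER, "-J"], capture_output=True, text=True, check=True
)
report = json.loads(result.stdout)
bps = report.get("end", {}).get("sum_received", {}).get("bits_per_second", 0)
print(f"Measured throughput: {bps / 1e6:.0f} Mbit/s")
# High-bitrate 4K remuxes can peak well above 80 Mbit/s,
# so a 100 Mbit TV port leaves little headroom.
```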
Watch out with the Apple TV: stay away from Dolby Vision, as playback doesn’t support all profiles (really only a few of them).
The problem with Samsung is that it probably doesn’t support the Dolby Vision HDR format, so it’s transcoding galore.
Just for your information, your router/network needs to be robust enough. I was using the router provided by my ISP and the devices streaming from my NAS kept getting dropped when streaming 4K (~15 GB files). I struggled because I thought it was a problem with the devices, but it was in fact the shitty router. Once I upgraded, I had no problems anymore.
Is streaming based on internet speed? ’Cause I live in a third-world country and only have 50 Mbps.
If it’s inside your own network, it depends on your local network speeds. Most routers have gigabit Ethernet ports; WiFi depends on the signal quality.
If it’s outside your network, you’ll be capped by your upload speed.
I’d like to emphasize that it’s not only about the speed of the local network connection. The computing power of the router matters too, as too much load will put a lot of strain on the router’s CPU.
That’s interesting, I’d never thought of that before. Is there some metric to measure this by? Like, do manufacturers report what CPU (chip?) their router has? I haven’t seen anything like that on listings for consumer products, at least.
Yes, most routers have a listed CPU speed, in GHz. But there are also routers with more CPU cores, so the raw speed can be a little misleading. It really depends on a lot of factors. Your router, modem, local network conditions, and the PC you’re running Plex on can ALL be potential bottlenecks when streaming.
All you need to know can be found on the excellent https://trash-guides.info/
Hello, I run a Plex server on my DS220+ NAS. Remux 2160p is the best choice here, and if you’re streaming over the internet (not LAN), the NAS with Plex Pass is strong enough to transcode.
While I can’t directly help you, I do know that unless you have some insane rig, you’ll never be able to transcode 4K, which means the player is by far the most important part.
I haven’t messed with it in some time, but for a while everyone basically only recommended an Nvidia Shield because it seemed to direct stream nearly everything.
Transcoding 4K isn’t as hard as you think. I used an i5-9500T and the iGPU could easily transcode ~80 GB 4K Blu-ray rips at double real-time speed. I’ve now switched to software transcoding on a 5800X and it also exceeds real-time speed.
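If you want to see how your own hardware stacks up, a rough way to benchmark is to transcode a sample clip into the void and compare wall time against the clip’s duration. A sketch with a placeholder filename and a generic libx264 software encode; swap in your hardware encoder options (QSV, NVENC, VAAPI) to test those paths instead:

```python
import subprocess
import time

SOURCE = "sample-4k.mkv"  # placeholder clip

# Ask ffprobe for the clip duration in seconds.
duration = float(subprocess.run(
    ["ffprobe", "-v", "error", "-show_entries", "format=duration",
     "-of", "default=noprint_wrappers=1:nokey=1", SOURCE],
    capture_output=True, text=True, check=True,
).stdout.strip())

# Decode + re-encode, discard the output, and time the run.
start = time.monotonic()
subprocess.run(
    ["ffmpeg", "-i", SOURCE, "-c:v", "libx264", "-preset", "veryfast",
     "-an", "-f", "null", "-"],
    check=True,
)
elapsed = time.monotonic() - start

print(f"Transcode speed: {duration / elapsed:.1f}x real time")
```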