Intel Releases Open Source Encoder for Next-Gen AV1 Codec

(Image credit: AOMedia)

Intel has published its own open source, CPU-based encoder for the next-generation, royalty-free AV1 codec (a codec is a program for encoding and decoding a digital data stream or signal). Intel is one of the main founding members of the Alliance for Open Media (AOMedia), the non-profit group behind the development of the AV1 codec.

Intel SVT-AV1

Intel’s new encoder, called Scalable Video Technology AOMedia Video 1 (SVT-AV1), aims to serve as a capable CPU-based software encoder until dedicated AV1 encoding hardware is ready for prime time. The encoder supports the Linux, macOS and Windows operating systems.
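
For those who want to experiment, below is a minimal sketch of driving the encoder from Python. It assumes an SvtAv1EncApp binary built from Intel's repository is on the PATH and that the -i/-b/-w/-h options match the project's documented usage; flag names and file names here are illustrative and may differ between releases, so check the encoder's help output first.

import subprocess

def encode_av1(raw_yuv, out_ivf, width, height):
    """Encode a raw YUV 4:2:0 file to an AV1 bitstream with SVT-AV1 (sketch)."""
    cmd = [
        "SvtAv1EncApp",
        "-i", raw_yuv,      # raw YUV input file
        "-w", str(width),   # frame width in pixels
        "-h", str(height),  # frame height in pixels
        "-b", out_ivf,      # output AV1 bitstream (IVF container)
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    # Hypothetical file names, for illustration only.
    encode_av1("input_3840x2160.yuv", "output.ivf", 3840, 2160)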

A CPU-based encoder requires a beefy system, so it's no surprise the real-time encoding specifications for SVT-AV1 are no joke. SVT-AV1 requires Skylake-generation or newer Xeon processors with at least 112 threads and at least 48GB of RAM for 10-bit 4K video encoding. Outside of video streaming companies, these types of systems are out of reach for most. Consumers who want to encode AV1 videos may want to wait for dedicated AV1 encoding hardware to appear, which may take another year or so.
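
As a rough illustration of those numbers, here is a small Python sanity check, assuming a Linux host, of whether a given machine meets the quoted real-time 4K 10-bit spec. The thresholds come from Intel's published requirements, not from probing the encoder itself.

import os

REQUIRED_THREADS = 112  # logical CPUs quoted for real-time 4K 10-bit encoding
REQUIRED_RAM_GB = 48    # RAM quoted for the same workload

threads = os.cpu_count() or 0
# Total physical memory in GB (Linux/Unix only).
ram_gb = os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES") / 1024 ** 3

print(f"Logical CPUs: {threads} (need {REQUIRED_THREADS})")
print(f"RAM: {ram_gb:.1f}GB (need {REQUIRED_RAM_GB}GB)")
if threads >= REQUIRED_THREADS and ram_gb >= REQUIRED_RAM_GB:
    print("This system meets the quoted real-time 4K 10-bit spec.")
else:
    print("Real-time 4K 10-bit AV1 encoding is likely out of reach here.")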

AV1 Moves Forward

The AV1 video codec is widely regarded as the next-generation codec to be adopted after h.264 and (to some degree) HEVC, not only because of significant improvements in video compression and the fact that it’s royalty free, but most of all because almost the entire technology industry is backing it through the AOMedia group. In the end, it’s adoption by video streaming sites, video-enabled applications and operating systems that will make or break a next-generation codec, and it seems that AV1 will get that support in spades.

Intel’s open source SVT-AV1 encoder joins the recently released open source dav1d AV1 decoder, so video streaming companies now have a complete open source toolchain if they want to be early with their support for AV1 videos.
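
To sketch what that pairing could look like in practice, the snippet below decodes an AV1 bitstream (such as one produced by the encoder sketch above) with dav1d. It assumes the dav1d command-line tool is installed and that its -i/-o options match the documented CLI; the file names are purely illustrative.

import subprocess

def decode_av1(encoded_ivf, decoded_y4m):
    """Decode an AV1 bitstream (e.g. one produced by SVT-AV1) back to raw Y4M frames."""
    subprocess.run(["dav1d", "-i", encoded_ivf, "-o", decoded_y4m], check=True)

if __name__ == "__main__":
    # Hypothetical file names: decodes the bitstream from the encoder sketch above.
    decode_av1("output.ivf", "decoded.y4m")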

Lucian Armasu
Lucian Armasu is a Contributing Writer for Tom's Hardware US. He covers software news and the issues surrounding privacy and security.
  • derekullo
    Well at least the RAM part is relatively cheap, although 48 gigabytes is a rather strange number.

    The 112 threads is the killer.

    112 threads is 56 cores with Hyper-Threading, or 28 cores (56 threads) per socket in a dual socket/cpu system

    On PCPartPicker, sorting by highest core count, the CPU with the most threads is the AMD Threadripper 2990WX with 32 cores / 64 threads; unfortunately it doesn't appear Threadripper is dual socket capable.

    Moving down the list we have an Intel Xeon E5-2699 v4 with 22 cores / 44 threads; unfortunately even two of those only gets you to 88 threads, leading me to believe a 4 socket system is the only way to support this.

    Unless Intel has a new 28-core-or-more processor with Hyper-Threading being released soon.

    This would make sense since they are pioneering the codec to begin with.
  • extremepenguin
    You would think 48GB is a strange number for RAM, but in dual socket systems it is somewhat common to have 48 or 96GB in a system. Or at least it was 3-4 years ago, when ECC memory was more expensive than it is now. Given the thread count required you are looking at a 2 socket system at a minimum, so this would fit with many common server builds from 3-4 years ago, when they probably started the spec.
  • DerekA_C
    Threadripper 3 could end up as a 64-core/128-thread part with that 7nm chiplet stuff lol, leaving 16 threads to spare.
  • miguelportilla
    Seems like a job for the EPIC.
  • photonboy
    EPYC not EPIC.
  • rhysiam
    I see everyone talking about the insane hardware requirements for real-time encoding, but here's a question: why would any consumers need real-time encoding to 4K 10 bit now or any time in the next few years?

    I currently have a small collection of 4K HDR Blu-ray rips, but if I'm going to re-encode them, it'd be to reduce the file size for archive purposes, so there's no real-time requirement there.

    The only time I ever use real-time encoding is if I'm streaming to a device that requires a lower resolution and/or older codec (e.g. h.264). But why would any consumer need to "reduce" the quality to 4K 10bit?

    Some time in the future when I want to stream my 8K source content to my 4K HDR tablet, then I might care about real-time AV1 encoding. But right now, surely consumers would only ever use this for archive content?

    Am I missing something?
  • ET3D
    That comment about 112 threads is really unclear. To quote: "at least 48GB of RAM is required to run a 4k 10bit stream multi-threading on a 112 logical core system". There's no indication of what the 112 logical cores are needed for, only that running on such a system requires 48GB. The user guide document omits the 112 logical core part when discussing RAM, simply saying that 48GB is required.

    It may be that there's a per-thread RAM overhead, and that 48GB is required on a 112 core system, but, for example, only 16GB would be required on a 32 core system. Impossible to say based on the phrasing on that page, and I could find no other mention of this that might shed more light on the subject.
  • TerryLaze
    rhysiam said:
    I see everyone talking about the insane hardware requirements for real-time encoding, but here's a question: why would any consumers need real-time encoding to 4K 10 bit now or any time in the next few years?

    I currently have a small collection of 4K HDR Blu-ray rips, but if I'm going to re-encode them, it'd be to reduce the file size for archive purposes, so there's no real-time requirement there.

    The only time I ever use real-time encoding is if I'm streaming to a device that requires a lower resolution and/or older codec (e.g. h.264). But why would any consumer need to "reduce" the quality to 4K 10bit?
    Some time in the future when I want to stream my 8K source content to my 4K HDR tablet, then I might care about real-time AV1 encoding. But right now, surely consumers would only ever use this for archive content?
    Am I missing something?

    Nah, you wouldn't. Even today QSV is capable of 4K h.265 10-bit in real time, or at least it's super fast; I'm not 100% sure on the real-time part.
    What I'm saying is that Intel is just trying to push this technology right now so it becomes the dominant codec; in the future Intel will integrate this into QSV and every Pentium will be able to do it.
  • rhysiam
    TerryLaze said:
    Nah, you wouldn't. Even today QSV is capable of 4K h.265 10-bit in real time, or at least it's super fast; I'm not 100% sure on the real-time part.
    What I'm saying is that Intel is just trying to push this technology right now so it becomes the dominant codec; in the future Intel will integrate this into QSV and every Pentium will be able to do it.
    Yep, this is exactly my point. There's discussion about how the compute resources required for real time encoding (either hardware or software) are out of consumer hands, but I'm guessing by the time consumers have any vague requirement for real time 4K 10bit encoding, there will be consumer options available at reasonable prices.
  • stdragon
    rhysiam said:
    I see everyone talking about the insane hardware requirements for real-time encoding, but here's a question: why would any consumers need real-time encoding to 4K 10 bit now or any time in the next few years?

    Any consumer-based need to encode won't be software-based; rather, it will be handled by a dedicated ASIC embedded in devices like smartphones and GoPro units.