
Why Apple's Severance gets edited over remote desktop software

jedberg

I use Keynote to make my presentations, and one time I wanted to build a presentation with someone else. I asked my friend who has worked at Apple for 20 years, "How do you guys build Keynote presentations together? There doesn't seem to be an easy way to do that?".

He said, "We don't collaborate at Apple because of the (perceived) risk of leaks. None of our tools are built for collaboration". Apple is famously closed about information sharing, to the point where on some floors every office requires its own badge, and sometimes even the cabinets within.

So it doesn't surprise me that their video editing tools are designed for a single user at a time.

Edit: This happened about six years ago, and they have since added some collaboration tools. However, this is more about the attitude at Apple in general and why their own tools lag on collaboration.

Edit 2: After the replies I thought I was going crazy. I actually checked my message history and found the discussion. I knew this happened pre-COVID, but it was actually in 2013, 12 years ago. I didn't think it was that long ago.

aschobel

I've been working at Apple for almost 12 years. While secrecy is indeed paramount, once a tool is internally blessed, we collaborate normally using it. Keynote collaboration is actually pretty standard nowadays.

Opinions are my own and do not reflect those of my employer.

DidYaWipe

Not to mention that blanket statements about Apple are absurd. I was a developer there for a decade, and every group was different.

I love reading articles that purport to tell the public how things at Apple work. They're almost always laughably full of shit.

ipaddr

Didn't the article say some floors require keys for different offices and sometimes filing cabinets?

That implies every floor is different which matches what you are saying.

Most of the stories that have come out felt like they were the image Apple wanted to give. It started with Apple going after a missing iPhone that was left at a bar. We've heard that those working on the latest design for the next iPhone were sequestered away from the rest of the company. I've always thought it was marketing spin, and I'm glad we have an ex-Apple employee confirming this. Back in the Lisa days Apple did split and silo divisions, and Apple did closely guard new iPhone designs with very few leaks happening, but the rest of the mythology is more marketing.

Spooky23

Anything Apple gets attention. But any large organization does various forms of segmentation. Many of these stories are “true”, but also bullshit.

I worked for a company that did some work for the federal government. Boring stuff. Their compliance rules essentially required that we firewall the folks with operational access to their data from the rest of the company. We included the physical offices in that to avoid certain expenses and controls companywide.

ignoramous

> Not to mention that blanket statements about Apple are absurd

It isn't absurd, as what GP mentions was imported into Amazon by Dave Limp, a former Apple C-suite executive. It was a terrible culture shock for most of the ICs on my team being reorg'd to report in to Limp after Steve Kessel (of Kindle fame), the previous leader, went on sabbatical.

tehnub

What do you risk by not giving that disclaimer?

gorlilla

The company claiming something you said, even out of context, could be interpreted as coming from the company. If you choose to disclose that you work for a company, you become a spokesperson for that company unless you disclaim those words (and even then there are other considerations, regardless of whose opinion is being expressed, because you've linked yourself to the company).

By adding that disclaimer, they decrease the likelihood of repercussions in the workplace for things said outside of it.

You can still get in hot water for anything you say that ties back to you or the company regardless if you disclose who your employer is.

This is the grey area that corporations typically carve out in a social-media policy so that employees can engage in discussions about their employer without speaking on behalf of their employer.

It's still a perilous position to put yourself in as an employee. Innocent and innocuous things can always be misunderstood or misinterpreted.

What happens when you use that disclaimer and are self-employed though?

galad87

That's a weird answer. Keynote can share presentations, and multiple people can work on the same presentation in real time, on macOS/iOS or the web version. The feature has been available for years: https://support.apple.com/en-us/guide/keynote/tan4e89e275c/m...

jedberg

> Note: Not all Keynote features are available for a shared presentation.

That's the main issue. But also this happened about six years ago.

galad87

The collaboration features were introduced in 2013 on the web version, and in 2016 on the native versions. And maybe check which features are actually not available before dismissing it.

jbverschoor

Six years ago Keynote already supported simultaneous editing through iCloud sharing.

cptskippy

> To collaborate on a shared presentation, people you share with need any of the following:

>

> A Mac with macOS 14.0 or later and Keynote 14.3 or later

>

> An iPhone with iOS 17.0 or later and Keynote 14.3 or later

>

> An iPad with iPadOS 17.0 or later and Keynote 14.3 or later

Those OSes were released around June of 2023, so a little over a year?

dcrazy

The documentation always refers to the current versions of the software, and the latest version of iWork always requires being on latest or near-latest OS. Collaboration also requires all clients to be on the latest version of the software.

mort96

[flagged]

Andrex

Then the question becomes why raise this irrelevant anecdote? The OP didn't research either.

eschaton

They were BSing you or working in a different part of the company than SWE.

Back in the day, Keynote files would just be passed around via a shared server so you and the people you were collaborating with could make and merge changes between them. E.g., I'd do one part of a presentation, Rick would do another part, and we'd copy our slides out of and paste them into each other's decks to get a complete version to rehearse with. If we had notes for each other, we'd hand over the notes rather than just directly change each other's slides.

There’s a lot of mythology that people just make up about how secrecy works at Apple. It’s mostly sensible.

carlmr

>Apple is famously closed about information sharing, to the point where on some floors every office requires its own badge, and sometimes even the cabinets within.

The severed floor.

eschaton

“Severance” is exactly how Apple’s New Product Security and Public Relations organizations would like all employees to be, to an absolute T. However, the rest of the company is much more pragmatic and understands well the value of collaboration and employees having enriched lives that they share with the workplace, since that leads to greater innovation and works well as a recruiting tool as well.

JKCalhoun

> We don't collaborate at Apple because of the (perceived) risk of leaks.

That sentence, by itself, is more or less correct (from my 26 years at Apple). However, it suggests/implies things that are not correct.

1) In case you got the impression: Apple certainly does not design software to be non-collaborative simply because it would enable sharing/leaking when used within Apple. I would say that Apple has been focused since day one on a mindset where one computer equals one user. The mindset stayed that way really until Jobs was fired, discovered UNIX, and then returned with logins and permissions. To this day, though, I think collaboration is often an afterthought.

So too do they seem to be focused on the singular creative. I suspect Google's push into Web-based (and collaborative) productivity apps (Google Docs, etc.) forced Apple's hand in that department — forced Apple to push collaborative features in their productivity suite.

2) Of course Apple collaborates internally. But to be sure it is based on need-to-know. No one on the hardware team is going to give an open preso in an Apple lunchroom on their hardware roadmap. But you can bet there are private meetings with leads from the Kernel Team on that very roadmap.

That internal secrecy, where engineers from different teams could no longer just hang out in the cafeteria and chat about what they were working on, arrived when Jobs came back. It probably goes without saying that it was rigorously enforced when the iPhone was a twinkle in Apple's eye.

The internal secrecy was sold to employees as preserving the "surprise and delight" when a product is finally unveiled but at the same time, as Apple moved to the top of the S&P500, there were a lot of outsiders that very definitely wanted to know Apple's plans.

3) Lastly, yes, plenty of floors and wings of buildings are accessible only to those with the correct badge permissions. I could not, for example, as an engineer, badge into the Design floor.

Individual cabinets needing badge access? I have no idea about that. I am aware of employees hanging black curtains in their office windows when secret hardware would come out of their (key-locked) drawers. (On a floor that is locked down to only those disclosed, obviously the black curtains become unnecessary.)

daniel_reetz

This matches my experience. In addition I was advised/strongly encouraged to "go dark" on social media and refrain from ever discussing work at lunch, even with teammates.

My badge only worked where I had explicitly been given access, and desks were to be kept clear and all prototypes or hardware had to be locked in drawers and/or covered with black cloths. Almost every door was a blind door with a second door inside, so that if the outer one opened, it was not possible to see into the inner space.

mattl

Keynote and Numbers are interesting apps.

Both are designed to replicate the functionality of Concurrence and Quantrix (itself a clone of Lotus Improv), both by Lighthouse Design, who made lots of apps for NeXTSTEP and were purchased by Sun.

Steve Jobs used Concurrence on a ThinkPad and also a Toshiba laptop to make presentations prior to Keynote (which I believe was created internally for him at first) even while back at Apple.

js2

> I knew this happened pre-COVID, but it was actually in 2013.

Real-time collaboration was added in Keynote 7.0 released in Sept 2016.

https://www.macworld.com/article/228811/keynote-pages-and-nu...

llm_nerd

>So it doesn't surprise me that their video editing tools are designed for a single user at a time.

The editors of Severance are actually using Avid. For music composition they're using Ableton. Neither is an Apple product. The remote desktop product they're using is Jump Desktop.

While the show is an Apple TV+ show, and they happen to use Macs in the process, this has shockingly little to do with Apple tools or products.

sinoue

Good point. No Final Cut. No Logic Pro. Apple & Adobe are missing out.

KaiserPro

Ex VFX person here.

It was quite common to have remote desktop cards in high-end machines so that you could hide them away somewhere quiet. The edit stations/Flame/Baselight machines all had a fucktonne of 15K SAS drives in them, so they were really noisy.

You couldn't invite a director to see what you were doing, when all you can hear is disk/fan whine.

They were quite expensive because they needed to be able to encode and send 2K video at decent bit depth (i.e. not 4:2:0 but 4:4:4) and low latency. Worse still, they needed to be calibratable, so that you could make sure the colour you saw was the colour on the other end.

Alas, I can't remember what they are called, thankfully, because they are twats to manage.

da_chicken

This is a pretty common problem with all true workstation level computer systems. It's like taking a rack from a data center and putting it in your office. You've got a dozen or more spindles and fans spinning. I've seen systems with $200,000 worth of RAM in them, but that was back when 256 GB of RAM was $100k. And, yeah, they had 15k SAS drives. If you think servers are expensive, you've not priced workstations.

Every time I've seen higher end workstations, the actual workstation itself was always in a separate room, and there's been some kind of remote KVM solution used. The workstation was always very noisy and generated a lot of heat. It's also just... a lot of money to shove under a desk where people kick it all afternoon.

nativeit

I do I.T. for a small broadcast studio (it's actually a sports venue, but they have their own production and broadcast studio), and it is indeed still very much like this. We have rack-mounted workstations alongside all the servers and networking, with KVMs to the next room where the production is handled. This was all spec'd and built out in 2019.

MasterScrat

By KVM, do you mean actual cables for keyboard, video and mouse going from the workstations to the user, or some kind of remote desktop tools?

jdietrich

A number of manufacturers offer soundproof 19" rack enclosures, but they are heavy, bulky and not especially cheap.

https://www.rackmountsolutions.net/24u-ucoustic-soundproof-s...

geocar

Not exactly sure how that compares, but I bought one of these quite hopefully: https://www.apc.com/us/en/product-range/203414049-netshelter... and "soundproof" means my home office no longer sounds like sitting on a subway train, more like the inside of an airport.

mcoliver

Probably Miranda. Brings back a lot of memories from the Flint/Flame/Inferno days. I remember buying a Tezro for ~150k USD in 2005/6. We also were "gifted" an Inferno around that time, which I heard originally cost multiple hundreds of thousands. When it showed up it was the size of a refrigerator and took dual 30A power feeds. It sounded like a jet and didn't last long.

Teradici came on the scene and started running everything over IP. Hardware at first (old EVGA pyramids were everywhere), where you had to route the video out into a custom card that then put out the signal via IP.

Now it's all software, with the leaders being Teradici (merged with HP Anyware, which came from IBM), NICE DCV (Amazon), Parsec, and a few others.

The big advantage in content production over something like VNC/RDP was color fidelity, local cursor termination, and support for hardware like Wacom tablets. You can even do 7.1 audio and multiple monitors. It turns out that when you are an artist, having a local-like feel is incredibly important. 60 fps is 16 ms per frame, so even with virtual workstations on AWS you want to deploy them in a region that is relatively close to the end user.
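The 16 ms per frame figure is the entire budget that capture, encoding, the network, and decoding have to share. A back-of-the-envelope sketch of that budget (the encode/decode costs and RTTs below are illustrative numbers of mine, not from the comment):

```python
def frame_budget_ms(fps: float) -> float:
    """Time available per frame, in milliseconds."""
    return 1000.0 / fps

def remaining_budget_ms(fps: float, encode_ms: float,
                        network_rtt_ms: float, decode_ms: float) -> float:
    """Per-frame budget left after fixed pipeline costs.

    Half the round-trip time is charged to each frame; a negative
    result means the stream cannot keep up at this frame rate.
    """
    return frame_budget_ms(fps) - (encode_ms + network_rtt_ms / 2 + decode_ms)

if __name__ == "__main__":
    # Hypothetical costs: 4 ms encode, 3 ms decode, and either a ~10 ms RTT
    # to a nearby cloud region or ~80 ms across a continent.
    print(f"budget at 60 fps: {frame_budget_ms(60):.1f} ms")
    print(f"nearby region:    {remaining_budget_ms(60, 4, 10, 3):+.1f} ms headroom")
    print(f"distant region:   {remaining_budget_ms(60, 4, 80, 3):+.1f} ms headroom")
```

With the distant region the headroom goes negative, which is the point about deploying close to the end user.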

mulmen

Why can’t you just run a longer cable into a temperature/noise controlled room nearby?

KaiserPro

Good question!

So there are a couple of options, depending on the hardware. If it kicked out HD-SDI you could just patch the display into the coax in the building and have done with it.

But that only worked if you were in the same building and your machine kicked out HD-SDI

Most machines either shat out dual-link DVI or, worse, some custom shit. Getting a cable that could reliably transport dual-link DVI >10 meters was difficult and expensive. Worse still, it had a habit of dropping back to single link, or some other failure mode that was ever so annoying to debug. Moreover, 10 meters often isn't far enough, especially if the room had a projector (so might be >5m long throw).

Now, that's the simple case. The hard case is multi-building. Say you have an operator working in London and the director in New York; you want to give them the highest quality picture possible. The only way to do that at the time was with one of these cards, or some nasty SDI hardware H.264 transcoder (hugely expensive at the time).

I really wish I could remember what they were called. They appear to have fallen out of favour.

Now you'd just use cineSync, as your laptop can encode video in real time (https://www.backlight.co/product/cinesync). Also, rumour has it that the Wolverine movie was leaked because a producer got coked up and left an unencrypted laptop on a plane, rather than using cineSync to show an edit to someone important. Alas, I can't verify that.

walrus01

I'd love to hear more stories about coked-up producers in the film industry from 15+ years ago. Having done technical codec work adjacent to some of it in the past, it's a wild business to be in.

mulmen

> Say, you have an operator working in london, and the director in new york, you want to give them the highest quality picture possible.

This is exactly the insight I was hoping for. Thank you.

7a1c9427

Were they Teradici cards?

walrus01

Based on the comment about 15K spinning drives this must have been quite some time ago, but there are very definite reach limits on DVI and DisplayPort cables. Let's say this was in 2007 and the state of the art was a dual-link DVI 2560x1600 display: you can't extend that in any practical way beyond about 15 feet. Extending a USB keyboard and mouse, by comparison, is trivial. Unless all of the desks and workstations were set up directly on one side of an acoustic barrier wall, it's a hard problem to solve.
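For a sense of why that mode was so hard to extend, its wire rate can be estimated in a few lines. The 20% blanking overhead below is my approximation; exact timings vary by mode:

```python
def tmds_wire_rate_gbps(h: int, v: int, hz: float,
                        blanking_overhead: float = 0.20,
                        bits_per_pixel: int = 24) -> float:
    """Approximate total TMDS wire rate in Gbit/s for a display mode.

    TMDS uses 8b/10b coding, so every 8 payload bits cost 10 bits
    on the wire.
    """
    pixel_clock = h * v * hz * (1 + blanking_overhead)  # pixels/s incl. blanking
    payload_bps = pixel_clock * bits_per_pixel          # raw pixel data
    return payload_bps * 10 / 8 / 1e9                   # 8b/10b coded rate

if __name__ == "__main__":
    # Single-link DVI tops out at a 165 MHz pixel clock (~4.95 Gbit/s coded);
    # 2560x1600@60 is roughly double that, hence dual-link.
    rate = tmds_wire_rate_gbps(2560, 1600, 60)
    print(f"2560x1600@60: ~{rate:.1f} Gbit/s on the wire")
```

Pushing nearly 9 Gbit/s over parallel copper pairs is what made long passive runs unreliable.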

diggan

> you can't extend that in any practical way beyond about 15 feet

For passive cables, that makes sense. But with repeaters, wouldn't you be able to go further? Maybe cable repeaters like that are newer than I imagine.

KennyBlanken

You can, and people do.

There are a slew of HDMI extension systems, some that even use ethernet with hardware encoding/decoding. Grandparent commenter hasn't worked in the industry in at least a decade if they're talking about DVI.

m0dest

These days, if you're just wiring to a single workstation in a nearby next room, 50 meter active optical Thunderbolt 3/4 cables can carry 5K+ DisplayPort video passthrough and data from your USB peripherals.

(It's "passthrough" and not "uncompressed" because DisplayPort may use DSC depending on the resolution and frame rate.)

US$500 for an optical cable can be a lot cheaper than paying for HDMI extender sender and receiver boxes.

walrus01

In a modern video editing system it's still a non-trivial challenge, because you can't just use any COTS HDMI extension system, which might be good for 2160p30 at 4:2:0, or maybe 2160p60 at 4:2:0, but may NOT be capable of 2160p60 at 4:2:2 or 4:4:4. Or it may not function for DCI resolution at 4096x2160, or anything 8K.

There are plenty of HDMI 2.0-compliant "video over ethernet cable or fiber" things, which are the ordinary COTS products that may not be sufficient for serious video editing needs.

People on video editing workstations these days are using higher-end monitors that can be trusted to work in 10-bit color and to match a certain color space for grading.

On the other hand, it's a lot easier these days to have a relatively quiet video editing workstation with 8 to 16 TB of local, PCI Express-attached NVMe storage for work space, and that same workstation can have a not-very-expensive 100GbE NIC attached to some large/noisy storage elsewhere.
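The gap between "good for 2160p30 4:2:0" and "2160p60 4:4:4" is roughly a 4x jump in uncompressed payload, which a few lines of arithmetic make concrete (10-bit depth assumed; raw payload only, ignoring blanking and link coding):

```python
# Average bits per pixel at 10-bit depth for each chroma subsampling scheme.
SUBSAMPLING_BPP = {
    "4:4:4": 30,  # full chroma: 3 samples per pixel
    "4:2:2": 20,  # chroma halved horizontally: 2 samples per pixel on average
    "4:2:0": 15,  # chroma quartered: 1.5 samples per pixel on average
}

def uncompressed_gbps(width: int, height: int, fps: int, subsampling: str) -> float:
    """Raw video payload in Gbit/s, before blanking or link overhead."""
    return width * height * fps * SUBSAMPLING_BPP[subsampling] / 1e9

if __name__ == "__main__":
    for subsampling, fps in [("4:2:0", 30), ("4:2:0", 60),
                             ("4:2:2", 60), ("4:4:4", 60)]:
        rate = uncompressed_gbps(3840, 2160, fps, subsampling)
        print(f"2160p{fps} {subsampling} 10-bit: {rate:5.1f} Gbit/s")
```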

KaiserPro

> some that even use ethernet with hardware encoding/decoding

We had those; the problem is that they lose bit depth. They were also fucking unreliable. We had a lot of HDMI extenders, and they worked for 1920x1080, and sometimes 2K if you were lucky.

We used them for the "prosumer" LCD projectors we had in the review rooms. They didn't work so well for the massive Christie projectors. (I seem to recall those abused 3G-SDI to get the resolution.)

doctorpangloss

Guys c'mon... The desk is set dressed. Nothing in the photos makes any sense. Last of all, Geoffrey Richman isn't doing editing work in Ben Stiller's apartment.

> Geoffrey Richman reviews season two finale footage. In his at-home edit bay (not pictured), he works on iMac, which remotes into a separate Mac mini that runs Avid from a post-production facility in Manhattan’s West Village.

Yeah. That would be a horrible experience.

KaiserPro

One director had their sofa shipped into the digital intermediate room (it had a 2K calibrated digital projector) for 4 months. An artist has got to be comfortable....

Karrot_Kream

Did y'all run the remote desktop over specific networks?

KaiserPro

For VFX, Disney/Marvel/Fox/Sony required that the entire network be air-gapped, with really stringent rules on USB, data tracking, and interchange. All internet access had to be done via RDP with copy/paste blocked.

Had Sony bothered to follow its own rules, it wouldn't have been hacked and had all its data leaked in 2014.....

But to answer the question: we had a shit tonne of networking, so as far as I'm aware it was just on the vanilla network. It might have been a separate VLAN though.

viraj_shah

This is a tangent but what was your journey into VFX?

KaiserPro

I studied digital media in uni. But I had been using linux since about 1999.

I wanted to be a compositor, but failed the rotoscoping test at the company I was working at. So I fell back on my technical skills, and became an infra engineer. I left VFX in about 2015, and sadly no matter how much I want to go back, I don't see much of a future in it. GenAI is really going to do a number on it.

diggan

> I left VFX in about 2015, and sadly no matter how much I want to go back, I don't see much of a future in it. GenAI is really going to do a number on it.

I don't think generative AI will make entire industries disappear, but rather make people within those industries do more with less. Seeing as you somewhat see what the future of the industry is, and assuming you're right, it puts you in a good position to gain the skills you think will be sought after. You seemingly have the technical skills too. Just an idea; I'm not working in either area, so take it with a bit of salt I suppose.

DidYaWipe

Interesting, but this misses perhaps the most embarrassing part: They're using Avid and not FCP.

I also don't buy the author's rationale for remote editing; it's oddly archaic: "high-end video production is quite storage-intensive, which is why your favorite YouTuber constantly talks about their editing rigs and network-attached storage. By putting this stuff offsite, they can put all this data on a real server."

Storage is cheap now, and desktop computers are more than powerful enough for any video editing. Any supposed advantage of remote "real servers" is going to be squandered by having to send everything over the Internet. The primary benefit of remote editing (and the much-hyped "camera to cloud") is fast turnaround, which you need for stuff like reality TV and news. But a dramatic series like Severance?

It is pretty baffling that Apple would create a PR vehicle that impugns its products like this. It would be better to say nothing. After Apple acquired Shake, they splashed Lord of the Rings, King Kong, and other major tentpoles on the Apple homepage at every opportunity... of course not mentioning that Weta was rendering those movies on hundreds of Linux servers instead of Macs. But at least Shake was the same product across all platforms, and it really was the primary effects tool on all those movies.

> they do not mention the use of Jump Desktop, which seems like a missed opportunity to promote a small-scale Mac developer. (C'mon Apple, do better.)

Oh boy, this is just a minor infraction in Apple's history of disrespect toward developers. They do this, and worse, to major development partners too. I'm not going to name names, but after one such partner funded the acquisition of material on its own equipment and that material was used in a major product keynote... Apple not only neglected to credit or even mention that partner, but proceeded to show the name of a totally uninvolved competitor in its first slide afterward. The level of betrayal there was shocking.

chippiewill

The storage requirements are still massive. I would guess the raw footage for something like Severance (and they probably shoot in at least 4K) is going to be in the area of a petabyte for the entire season.

Even today it's not close to practical to have an entire episode's worth of raw footage (of which there'll be many many takes, many many angles) entirely on an editor's workstation.

The surprising aspect is that they don't use proxies for editing rather than remote desktop.

jiveturkey

Ben Stiller claims just 83 terabytes for editing. Maybe this is the size of proxies. https://www.youtube.com/watch?v=TXNQ01Sy6Xw&t=45s

epcoa

83 terabytes of raw footage for one episode (the S2 finale). This was the longest episode (which doesn't necessarily correspond with footage shot). But for a 10 episode 4K HDR series, 1 PB is in the ballpark for a season.
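Those two figures, 83 TB for the longest episode and roughly ten episodes per season, can be sanity-checked with quick arithmetic. The 2 Gbit/s camera data rate below is a hypothetical round number, not the production's actual recording format:

```python
TB = 1e12  # terabyte in bytes (decimal, as storage vendors count)

tb_per_episode = 83  # quoted upthread for the S2 finale
episodes = 10

season_pb = tb_per_episode * episodes * TB / 1e15
print(f"season total: ~{season_pb:.2f} PB")  # in the ~1 PB ballpark

# How many hours of footage would 83 TB hold at an assumed 2 Gbit/s?
camera_gbit_s = 2.0
hours = tb_per_episode * TB * 8 / (camera_gbit_s * 1e9) / 3600
print(f"83 TB at {camera_gbit_s:.0f} Gbit/s: ~{hours:.0f} hours of footage")
```

Tens of hours of raw takes per episode is plausible for a shoot with many cameras and many takes, so the ~1 PB season figure holds together.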

jpc0

> The surprising aspect is that they don't use proxies for editing rather than remote desktop.

In my experience it is way easier to scale storage bandwidth than compute, at least locally.

There have been times when I've been able to cut a shoot from the raw files before proxies were available, and this has been corroborated by other editors.

So it took less time to cut and submit for review than to actually generate the proxy media.

Sure, if your workflow has a decent gap between shooting and post, then generating proxies is trivial, but sometimes a little more storage and memory bandwidth goes a very long way.

steve1977

> The surprising aspect is that they don't use proxies for editing rather than remote desktop.

Who says they're not using proxies and remote desktop?

jauntywundrkind

Sorry, this take is not good.

Yes, attaching many terabytes of video is cheap now.

But scrubbing through that high-res raw video isn't (just) size-intensive, it's throughput-intensive. Size : throughput :: energy density : power density. You can get a pretty good all-SSD NAS, but a 40 Gbps (5 GB/s, minus overhead) Thunderbolt 4 link is still going to be OK but not stellar. A single desktop SSD can triple that!

I can fully see the desire to remote-stream. Being able to AV1-encode on the fly to your local editing station, or even H.265, at reduced quality, while still having the full bit depth available for editing sounds divine.
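The throughput gap being described is easy to put in numbers; the link and stream figures below are illustrative round numbers, not measurements:

```python
def gbit_s(bytes_per_sec: float) -> float:
    """Convert bytes per second to Gbit/s."""
    return bytes_per_sec * 8 / 1e9

tb4_link = gbit_s(5.0e9)     # ~40 Gbit/s Thunderbolt 4, before overhead
local_nvme = gbit_s(14.0e9)  # a fast PCIe 5.0 desktop SSD, roughly
remote_stream_mbit = 20.0    # a watchable 4K HEVC remote-desktop stream

print(f"Thunderbolt 4 link: {tb4_link:.0f} Gbit/s")
print(f"desktop NVMe SSD:   {local_nvme:.0f} Gbit/s")
print(f"remote stream:      {remote_stream_mbit:.0f} Mbit/s "
      f"(~{local_nvme * 1000 / remote_stream_mbit:.0f}x less than local raw)")
```

The remote machine still needs the multi-GB/s path to the footage, but only a few Mbit/s of it has to cross the internet to the editor's screen.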

DidYaWipe

What "take" are you talking about?

You're saying Thunderbolt 4 is going to struggle with something, and then touting a desktop SSD as "tripling" TB 4 throughput... but finally declaring that "remote streaming" is somehow better than both of those?

What an absolute crock.

jauntywundrkind

These takes:

> I also don't buy the author's rationale for remote editing; it's oddly archaic

> Any supposed advantage of remote "real servers" is going to be squandered by having to send everything over the Internet

Remote streaming is far better. A 2 Mbps or 20 Mbps connection to a powerful editing station is awesome. A compressed-down H.265 stream with HDR will still let you edit very well, while the remote machine handles the intensive editing tasks with ease.

This really isn't hard at all, the advantages & wins are amazing, remote desktops have been amazing for decades now. I struggle to see how you continue to justify being so far up a creek, other than exhibiting pathology.

varenc

re: Apple not using Final Cut Pro (FCP). I feel like Apple made an intentional decision to abandon the high end production market when they released FCP 10 in 2011. They dropped multicam, XML import/export, etc. I heard they eventually brought most of these features back but seems clear Apple isn't focusing on this part of the market.

DidYaWipe

FCP 7 was garbage; Apple had bought the product from Macromedia. It was never "high end."

The new FCP could have righted many wrongs, but Apple turned its development over to people who didn't even understand industry-standard terms... and who rejected input from experts Apple had hired years earlier. But that's Apple's standard behavior. They just don't learn.

facile3232

> It is pretty baffling that Apple would create a PR vehicle that impugns its products like this.

I'm struggling to see any of this, frankly. Of course Apple uses non-Apple software. It'd be pretty weird if they didn't.

All this marketing bullshit reinforces the value of refusing to engage with marketing. What a massive waste of time and effort for all societies and cultures involved.

DidYaWipe

Struggling to see any of what?

facile3232

How does this "impugn" Apple at all?

kjeldsendk

Avid does have a cloud based solution. This isn't that.

It's a clever way to have your media centralized and yet give editors all over the world access to it.

And a modern AVID system does not struggle with a few editors accessing the same footage.

First of all, it's usually a proxy format, and secondly, the storage can deliver a combined 800 MB/s per box sustained for x number of editors at the same time.

Yes, I do Avid; feel free to ask.

johnklos

Nothing these days "struggle(s) with a few editors accessing the same footage".

AVID hasn't been at the forefront of video editing since the Avid/1 / ABVB days. They sell a reasonably usable program with horrible hardware (since the Meridien hardware; it's good they finally let us use other hardware such as Blackmagic), but never truly fix large problems. People therefore stay on a specific version of the software for ages, because everyone is scared of new and different bugs.

AVID's shared media offerings are tenfold the cost of other storage options simply because they have a flag on the mounted volumes that tells Media Composer to allow project and media sharing. "800 MB/s per box sustained" means nothing, because anyone can do that easily with commodity hardware.

In other words, AVID is milking their cash cow and they really don't innovate or even try to offer a good product.

Apple, on the other hand, destroyed their professional editing products, then replaced them with decent tools, but ones that are worlds different. Many people have mixed feelings about this. On the other hand, if you want to edit 8K ProRes, Final Cut Pro makes it simple on any ARM-based Mac.

kjeldsendk

What's your experience based on? Do you work in post production on big projects?

It's their dependency on Blackmagic that's been their biggest problem for the past 5 years.

Meridien was light years ahead of the competition. The FireWire-based Adrenaline sucked.

And you won't find anyone complaining about their DX series; it's just too bad they dropped it.

And you're really not understanding the way Avid NEXIS works if you think it's just a flag.

johnklos

First, facts don't rely on the amount of experience the person sharing them has. But I do get that it's easier to take someone at their word when they have lots of experience, so yes, I've worked on all sorts of projects of all sizes.

I think you've been sold a bunch of ideas. For instance, Avid has no dependency on Blackmagic. They use Open IO, which means you can use any card that supports Open IO, whether Aja, Blackmagic, Bluefish, Matrox, whatever.

NEXIS / ISIS isn't special. The flag is literally just a flag that tells Media Composer to enable bin and media sharing. It can be enabled on any kind of sharing: NFS, AFS, SMB, et cetera. For example, check out the Mimiq software for enabling it wherever you want.

bob1029

I spent some time a while back thinking about a web-native video editing tool with very lightweight client demands. This came up after watching all those LTT videos about their storage & networking misadventures around the editors. It seems something approximating this (or superior to it) has already been developed.

The way you develop and manage the proxies appears to be the biggest part of the battle in making things go fast. There's no reason for editor workstations to be operating on the full-res native material unless there's a targeted reason to do so.
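One common way such proxies get generated is a batch transcode of the camera originals down to a light intra-frame codec at reduced resolution, keeping filenames so the NLE can relink later. A sketch of a generic approach, not any particular production's pipeline (the paths are hypothetical; the ffmpeg flags are standard):

```python
from pathlib import Path

def proxy_command(src: Path, dst: Path, height: int = 720) -> list[str]:
    """Build an ffmpeg invocation that writes a ProRes Proxy copy of src."""
    return [
        "ffmpeg", "-i", str(src),
        "-vf", f"scale=-2:{height}",             # keep aspect ratio, even width
        "-c:v", "prores_ks", "-profile:v", "0",  # profile 0 = ProRes Proxy
        "-c:a", "copy",                          # pass audio through untouched
        str(dst),
    ]

if __name__ == "__main__":
    # Hypothetical clip name; in practice you would walk the whole card dump.
    cmd = proxy_command(Path("raw/A001_C002.mov"), Path("proxy/A001_C002.mov"))
    print(" ".join(cmd))
```

Run per clip (e.g. via `subprocess.run`), this turns multi-hundred-Mbit/s originals into proxies an editor's laptop can scrub from ordinary shared storage.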

viraptor

LTT is probably not a good/representative example for anything. They'll do infra stunts for content, then it will fail and they'll get content from the failure and content from the new thing. It's in their interest to be slightly on the bleeding edge and slightly janky while having access to subsidised hardware.

And I mean that in a completely positive "it's awesome" way. Just... not the problems anyone else should be facing.

kjeldsendk

Before Covid your idea was the one everyone was pursuing, including Avid, with an embarrassing system that I never saw in a satisfying version.

With Covid, remote access became the norm and the online/proxy workflow more or less died. Avid still has a working version (better than the original) but it's not widely used.

Proxies are used for several reasons: expensive storage, heavy codecs at high bitrates, or multicams.

They are typically avoided whenever possible because the online part of a proxy-based workflow can be a challenge. And especially if you have tight deadlines, you want all the variables out of the way.

DidYaWipe

That is a pile of contradictory statements. And since you're upset by that idea and unwilling to re-read what you wrote, here's some spoon-feeding:

"With Covid remote access became the norm and the online/proxy workflow more or less died"

No; remote access DEMANDS a proxy workflow, since you're not going to edit full-resolution files over the Internet. So it did not "die;" just the opposite. Witness the entire "camera to cloud" marketing mania that swept NAB a few years ago. That's based entirely on the rapid upload of proxy files to begin editing ASAP.

From NAB last year: “We introduced the [Blackmagic Camera] iPhone app a little while ago,” said Bob Caniglia, director of sales for the company in North America. “You can shoot with that phone, work with the cloud service, share proxies. The camera to Blackmagic cloud to Resolve workflow started with the camera app. The Ursa Broadcast G2 [camera] is now in beta for that software too. That's a good direction on where we're going.”

Does that sound like it died? Or https://blog.frame.io/2024/04/11/visit-us-at-nab-2024/

But back to your assertions: "Proxies are used for several reasons, expensive storage, heavy codecs at high bitrates or multicams. They are typically avoided whenever you can because the online part of a proxy based workflow can be a challenge"

That makes absolutely no sense. You just claimed that proxies are used to avoid "heavy codecs at high bitrates" but then claim "the online part of a proxy based workflow can be a challenge." But you neglected to provide a single example of what's so "challenging" about it, especially when you just cited proxies as an advantage.

Thus, since you pushed the issue, we see that in fact it is you who has no idea what you're talking about. But hey, keep insulting other users.

Or... if you prefer to be informed: https://filmmakermagazine.com/120946-new-remote-tools-workfl...

https://www.tvtechnology.com/news/cameras-support-expanding-...

and many many more...

DidYaWipe

[flagged]

maphale

I use Avid too, and manage two sizable (300+ virtualized editors) on-premise VDI systems, and one bigger (some days 600+) AWS-based one that runs more Adobe than Avid. Remote experience is a bandwidth and latency thing more than anything else, but the technology is limited - for example, you can't do a good ProTools system virtualized with a control surface, and sync can be a real pain to sort out. As for Avid's solutions to the problem, they do it a couple of ways:

- Composer/Nexis all hosted on Cloud (AWS): fine, but pricey and the Nexis experience is meh

- Composer hosted Cloud/Nexis hosted on Prem: actually works well, but you need to have a direct-connect to AWS (the network can be pricey)

- Composer on on-premise VDI/Nexis hosted on prem: works really well, and I have a bias towards this over fully-in-cloud, not only for security reasons but because the TCO is lower

- Composer Cloud (or whatever they call it today - it used to be Composer Sphere): this is a setup where you instead stream real-time proxy to the Composer from MediaCentral. You can download hi-res media if you need to. It works OK, but it's more suited for news workflows. Security is a thing with this solution.

- Adobe/OpenDrives on AWS: I mention this, because we do this too. This has all sorts of things to talk about, and is pretty good, but, again, you gotta know what you are doing.

For the on-premise ones, VMWare is our Hypervisor of choice, and, yup, we are looking for other options. And we have all the usual IT problems: domain management, updates, roaming desktops, etc.

If you are looking for 3rd-monitor image viewing (like in the old days with hardware), you can swing NDI or 2110. NDI is ok, and for 2110 you need a network and router to handle it.

john_oshea

The "600+, AWS" detail is great to read, as confirmation that this kind of thind does work. We're urrently setting up remote AWS systems and finding a lot of moving parts for getting smooth playback while editing in AE/PPro.

If you have time to expand on the "bandwidth and latency thing", I'd love to hear more. Even a "you need to be geographically within X miles of the instance" ballpark figure would be wonderful to know.

kjeldsendk

During Covid I ran a homemade NDI solution for remote color correction.

It worked... kinda.

fragmede

Have you used an EditShare?

kjeldsendk

I actually tried the first version... back in the day. But even though Nexis is stupidly expensive, it's still acceptable if you have the productions for it.

One of the main reasons it's used in larger post houses is the worldwide hardware and software support, with people on site if needed.

LASR

Oh how far we've come.

My home internet is symmetric 3Gbps/3Gbps fiber. My fiber ONT terminates in a nook under the staircase, which doubles as my server room. I have half a dozen boxes running various things: four 2012 i7 Mac minis running Linux KVM and hosting various critical services - Pi-hole, home automation, HomeKit Secure Video, etc.

Then there's a giant former gaming PC with 7 HDD bays running the entire storage backend for a whole load of GoPro/Osmo/Insta360 videos I capture. Rclone to Google Photos for backup. I don't edit any videos; they're just there to capture memories so that at some point, when AI tools get good enough, I can have them generate clips. The same box runs my Plex server with HW transcoding.

Then there is the actual gaming PC, a mini-ITX running steam remote play. Has power, a network cable and a fake HDMI dongle that emulates a monitor to trick the GPU into thinking something is actually plugged in.

Basically everything I do with desktop PCs at home is via some sort of remote interface.

Remote gaming is probably the most demanding of all of these. Low-latency HW-accelerated solutions eg: Parsec / steam-link are incredible technologies.

I carry an AppleTV + PS5 controllers to friends' houses and play the latest games across the internet.

deadbabe

The most impressive thing here is that you physically go to friends houses to play games together.

bombela

what impressed me is the latency low enough to game remotely. This seems unattainable in the bay area.

throwaway314155

Believe it or not, some users here are in fact from other places than the Bay Area. I know, shocking, right?

varenc

what impressed me is the 3gigabit up/down fiber connection. That seems unattainable in San Francisco.

saagarjha

Doesn’t Sonic offer 10Gbps to many homes?

culi

> I carry an AppleTV + PS5 controllers to friends' houses and play the latest games across the internet.

Be honest—you're just playing Factorio

dalanmiller

Do you have a write up on how you get this to work with Apple TV? What you have I consider the dream setup.

pmalynin

I’ve used Moonlight before https://github.com/moonlight-stream

You can just follow that thread, no write-up needed tbh.

throwaway314155

The honest answer is that it doesn't work very well in practice. This is seemingly worsened over Wi-Fi on AppleTV whose Wi-Fi stack constantly interrupts streaming in order to do a variety of things with their "location services".

Moonlight works great (over ethernet at least) locally though.

culi

> at some point when AI tools get good enough just have it generate clips

iPhones already do this today. I'm often surprised how well made they are

abalone

> "In other words, little of the horsepower being used in this editing process is actually coming from the Mac Mini on this guy’s desk... I’m not entirely sure we were supposed to see that, but there it is. Oops."

Sounds like this author didn't watch the whole video. They are completely open about the fact that the editing team collaborated through remoting. At 5:20 an editor specifically says they "remoted into the Mac mini."

The second half of the post raises an arguably good question about the need for fancy Macs when cloud-based workflows only require glorified terminals. But that too may be misplaced here -- it's entirely possible that the team members each do local editing work and then host their own collaboration sessions.

citizenpaul

That was a lot of words to reiterate that Apple is a consumer focused company. Not enterprise or B2B.

turtletontine

Bingo. So many decisions made perfect sense once I realized Apple is basically a lifestyle brand that makes electronics, and Microsoft is a massive bureaucratic B2B conglomerate. Totally explained Microsoft’s ineptitude with consumer facing products (remember Windows Phone? Zune?), yet they have a stranglehold on the business world. This is the opposite: Apple is designed for locking individuals into its lifestyle (or ecosystem, if you prefer), and has mostly given up on enterprise facing products.

walrus01

TBH it's still possible to use a macbook air as basically a fancy unix-like workstation that has great battery life, and not buy into any of the apple ecosystem. No icloud account, no icloud backup, no iphone, no use of itunes or appletv, no apple synchronization of anything. The day that stops being viable is the day I stop buying them.

The extent of my 'cloud' involvement with apple is the operating system software update mechanism and having an account to download Xcode, so that I can install compiler + macports on a new machine.

12_throw_away

Heh, it sure would be nice if they made a computer that was explicitly for getting work done (hell, they could call it a "workstation"). I miss the days when big tech still saw a market for this ...

josephg

They do - that’s the point of the Mac Pro. The problem is software. Lots of expensive pcie ports won’t help much when you can’t put a GPU in any of them to use cuda and such.

There’s also so much inefficient, bloated crap that ships with modern macOS that I would never pick it for a proper workstation these days. I have CPU meters in the system tray, and there’s always some stupid process gobbling up all my spare cycles. The other day it was some automatic iPhone backup process. (Why was that using so much cpu, Apple?). Sometimes it’s indexing my hard drive, or looking for faces in photos, or who knows what stupid thing. It’s always something, and its almost always first party software.

In comparison, the cores on my Linux workstation are whisper quiet, and usually idle at 0%. The computer waits for me to give it work.

robocat

I also wish Microsoft would treat developers as a separate customer segment to market to.

When the people using your tools hate the tools, that isn't a good sign.

mulmen

Microsoft also created the Xbox and every developer I know runs a Macbook.

darthwalsh

I use a MacBook not because macOS is the best platform for development, but because it's the hardest to virtualize.

Our project supports the three major desktop operating systems. I have Windows and Linux VMs that I can switch to when I need to test something on those OSes. No serious corporation is going to risk Hackintoshes.

dboreham

No MacBook here.

null

[deleted]

scarface_74

The remote computers are still Macs.

mattl

I’d love to know what Apple uses internally for stuff like email and calendaring.

I’m fairly sure they don’t use iCloud which is why some of that stuff is still less than desirable.

We can probably assume that Microsoft uses some kind of Exchange set up and Google will use a version of Gmail.

Whenever I meet with people from Apple, it’s over WebEx.

I heard a rumor that they use some Oracle enterprise groupware, which is presumably https://en.m.wikipedia.org/wiki/Oracle_Beehive

rcarmo

They use Oracle mail servers for their corporate e-mail. Ironically, the direct descendant of the Sun Internet Mail Service software I wrestled with back in the early 2000s.

mattl

Any idea what Oracle’s mail server is called? Is it the thing I linked?

I don’t find it all that surprising:

- Sun/NeXT were doing stuff together before Apple and NeXT merged

- Lots of Java stuff at Apple immediately following the merger, including a Cocoa-Java bridge, and WebObjects was rewritten in Java

- Oracle/Sun stuff doesn’t need to be run on Windows

- Steve Jobs and Larry Ellison were good friends

alwillis

How long before these new Apple-made servers are available (or a variant) as a backend for video editing?

https://www.apple.com/newsroom/2025/02/apple-will-spend-more...

Opening a New Manufacturing Facility in Houston

As part of its new U.S. investments, Apple will work with manufacturing partners to begin production of servers in Houston later this year. A 250,000-square-foot server manufacturing facility, slated to open in 2026, will create thousands of jobs.

Previously manufactured outside the U.S., the servers that will soon be assembled in Houston play a key role in powering Apple Intelligence, and are the foundation of Private Cloud Compute, which combines powerful AI processing with the most advanced security architecture ever deployed at scale for AI cloud computing. The servers bring together years of R&D by Apple engineers, and deliver the industry-leading security and performance of Apple silicon to the data center.

nickdothutton

There are a number of reasons why the industry centralises, particularly in post. One of them is the fact that the shot footage is insured, and those policies have very strict clauses about handling the material. Yes, this applies to an all-digital production just as it applied in the film era.

kmeisthax

Insurance companies haven't grokked "Lots of Copies Keeps Stuff Safe" yet. Unless the insurance is anti-piracy insurance?

Aurornis

Spreading copies around ad-hoc isn’t a backup plan.

They have redundancy and backups.

> Unless the insurance is anti-piracy insurance?

This is a big part of it, actually. Content that leaks prior to launch can reduce revenues significantly. Both from lost viewership due to people already having seen it, and from negative reviews of the unfinished early edits. Many movies change significantly for the better from early cuts.

kmeisthax

The comment I was replying to made it sound like this was insurance purely for recorded footage being destroyed, and not it being leaked. The former is very easy to fix by having everyone's workstations keep a copy of what they're working on[0]. But the more copies you have, the easier it is for the footage to leak. The two risks impose different and conflicting mitigation measures.

[0] Remember that one time Pixar rm -rf'd their server and almost lost Toy Story 2 but for one manager who had a local copy of the project at home?

Andrex

> Spreading copies around ad-hoc isn’t a backup plan.

I wish this were an internet rule we could repeat ad infinitum.

cbozeman

I suspect that's a big reason. Remember about a decade back or so when Fox had four of its upcoming television shows leaked onto public and private tracker sites about six months before their actual premieres?

Lucifer, Minority Report, Blindspot, and Carmichael were all leaked, and those shows were on different networks, which means it was likely a third-party company doing effects in post. I don't recall if it was ever sussed out what exactly happened and how they all got leaked, but it definitely made the industry a bit warier.

hnlmorg

Backing stuff up in the cloud is easy (albeit expensive). Keeping the data from leaking is the hard part.

Most studios are hyper paranoid about their movie or TV show getting leaked.

afro88

The linked promotional materials [0] say that they remote into a mac mini running Avid.

> he works on iMac, which remotes into a separate Mac mini that runs Avid

So the conjecture from the article that the mac mini isn't powerful enough is false

> In other words, little of the horsepower being used in this editing process is actually coming from the Mac Mini on this guy’s desk. Instead, it’s being driven by another Mac on the other side of a speedy internet connection

And based on other comments here, this is a pretty common way to do things.

Why the sensationalism?

[0] https://www.apple.com/newsroom/2025/03/how-the-mind-splittin...

creatonez

> So the conjecture from the article that the mac mini isn't powerful enough is false

Not what the article says... and that doesn't follow anyways. The remote experience was terrible and the non-remote experience wasn't shown at all. How fast the Mac Mini theoretically is doesn't matter at all once you have such an insane bottleneck.

> And based on other comments here, this is a pretty common way to do things.

And? The industry is making a mistake by knee-capping its editors. In the video it's going into seconds-per-frame territory; that's as close to unusable as you can get. The article seemed to definitively prove its case that someone desperately needs to step back and look at what the requirements for these editors actually are, rather than ramming conflicting demands into each other to appease the anti-piracy insurance mobsters.

brcmthrowaway

I would like to use this comment to mention Parsec. It's unbelievable how much snappier it feels compared to the default Screen Sharing. What is their secret sauce?!

I just wish it didn't require an internet connection for authentication

poisonborz

Try Moonlight, similar tech but open/no cloud auth. It works better over local networks than over the internet, though (internet use you need to set up via VPN/port forwarding, etc.).

xoa

Sunshine/Moonlight are awesome, but fwiw in this specific context it's worth noting that macOS support in Sunshine is still extremely experimental and janky. It's Homebrew-only for now, and when I last tried it the main release didn't install at all, only the beta. And then locally, even over a 10 gig network, while the image quality was great the latency was abysmal, even before other oddities. I will say this is an enormous improvement over even a year ago, but given the initial gaming-focused use case I suspect that (not at all unreasonably!) they've prioritized client capabilities when it comes to Macs for now.

jdboyd

Last I looked, they didn't support passing through USB devices like Wacom tablets or edit controllers or space mice. I am eager for that stuff to work so that I can start using moonlight/sunshine for more of my work.

brcmthrowaway

Sadly seems to be NVIDIA/PC? only.

wrigby

This bummed me out, but it looks like it's not? From the Sunshine (server) GitHub page[1]:

  Sunshine is a self-hosted game stream host for Moonlight. Offering low latency, cloud gaming server capabilities with support for AMD, Intel, and Nvidia GPUs for hardware encoding. Software encoding is also available.
1: https://github.com/LizardByte/Sunshine

jimmySixDOF

I have it running clean and crisp hosted off an old i5 SFF HP G2 mini slice with integrated Intel graphics so give it a try!

numpad0

I think it was originally all NVIDIA proprietary, then got reverse engineered OSS client(Moonlight), then got RE'd OSS server(Sunshine).

kjeldsendk

GPU encoding/decoding of the frame buffer.

thomasjudge

Is there a free alternative to Screen Sharing that is more performant? I'm just surprised at the latency and cpu usage of Screen Sharing on my lan. (Mac specific)

brcmthrowaway

NoMachine

rcarmo

Just don't. It is janky and buggy and keeps coming up time and time again, but it is not a real solution.

null

[deleted]

null

[deleted]

TZubiri

Is it really just authentication? I thought the whole screen data was passed through an intermediate server, but I can see how a peer-to-peer system would be more efficient. I can't imagine the wonky NAT hacks that need to take place, though.
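The classic "wonky NAT hack" is UDP hole punching: each peer learns the other's public endpoint from a rendezvous server, then both send first so their NATs open matching mappings. A minimal loopback sketch of the send-first handshake (the NATs and the rendezvous server are elided; two local sockets stand in for the peers):

```python
import socket

# Two "peers" on loopback stand in for hosts behind NATs.
a = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
b = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
a.bind(("127.0.0.1", 0))
b.bind(("127.0.0.1", 0))
a.settimeout(2)
b.settimeout(2)

# In the real protocol, a rendezvous server would have told each peer
# the other's public ip:port. Here we just read the bound addresses.
a_addr, b_addr = a.getsockname(), b.getsockname()

# Both sides send first; the outbound packet is what "punches" a
# mapping through each peer's NAT so the inbound packet is accepted.
a.sendto(b"punch", b_addr)
b.sendto(b"punch", a_addr)

# After the punch, traffic flows peer to peer, no relay in the path.
msg, _ = a.recvfrom(1024)
```

When both NATs are symmetric this fails, which is why tools like Parsec still keep a relay fallback.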

TZubiri

How would it not require an internet connection lmao, it's a remote connection tool

kjeldsendk

I think op meant cloud based in the sense you have to create a user account on their site and everything goes through that.

xoa

>How would it not require an internet connection lmao, it's a remote connection tool

I'm kinda surprised you've managed to be on HN for 5 years and never come across the concept of a "LAN" or "VPN" before, but I guess you're one of today's lucky 10000. To the first, sometimes you have machines (or VMs) local to your own network but in another physical location that you'd like to be able to access from your own system. It's a fairly significant use case, and one where no internet connection is involved whatsoever. For example it's generally desirable to locate powerful (and in turn generally loud) servers and associated gear (including environmental control, redundant power etc) in physically isolated locations from where the humans are working for noise reasons if nothing else, though security and efficiency are important as well. While it's possible to pipe raw video over IP, a quality remote desktop solution will generally be more flexible/scalable and doesn't require special (expensive) extra hardware and potentially additional fiber.

And for systems located on other LANs remote from your own, you can use a VPN to link them securely as if they had a direct physical (though higher latency/more jittery) link, again avoiding any exposure to the public net. That then reduces to the above. In both cases it's desirable to have zero unnecessary 3rd party dependencies.

ibeff

> I'm kinda surprised you've managed to be on HN for 5 years and never come across the concept of a "LAN" or "VPN" before

Unnecessary snark.

TZubiri

Ah, you got me I guess; I didn't think of the VPN case. It does seem like an asterisk in the grand scheme, especially since the applicability of this tech on LANs is very limited (there's little lag on a LAN, and it's already "internet" in the sense that it uses IP; for a truly local remote protocol you'd consider an Ethernet-framing tool or a Unix-socket tool like X11), so this would only be useful in this network-virtualized VPN ecosystem (and also in scenarios where you want to ensure no third party handles the data, by self-hosting the server part of Parsec).

What is clear to me is that Parsec belongs to a newer breed of remote tools, in line with TeamViewer and AnyDesk, that primarily respond to the needs of the post-ISP-firewall era, where ports are blocked by default, so peer-to-peer remote tooling becomes harder to install and administer; these have a client-server-client architecture. And Parsec builds on this architecture by placing some secret lag-reducing sauce on their server instead of just authenticating and forwarding.

My guess is that they have a proprietary predictive, interpolation-based algorithm tightly coupled to the OS UI, and this secret sauce lives, closed source, on their backend, so you would kind of need to host a third server in the middle; maybe we will see a competitor for the VPN niche, or an open source alternative.

If an open source solution arises, I bet that it would require an installation of a server, and it would probably start with X11 or Wayland tight coupling.

hackinthebochs

Remote desktop between computers on a local network

TZubiri

I guess, but I thought the selling point of parsec is that it reduces lag (or hides it).

This would make sense in VPN environments though.

Syzygies

Cloudless Fluid requires a Teams Enterprise subscription. Or one can manually enter IP addresses. Their default is cloud mediation, so yes, they presume a working internet connection.

jpalomaki

Two "Remote Desktop" tools mentioned in the article 1)Jump desktop 2)Parsec.

[1] https://www.jumpdesktop.com [2] https://parsec.app

impoppy

Video editing is not as portable as coding; there ain't no git for it. It doesn't surprise me that they have to do this; I imagine it's simply speedier and comfier to connect to a desktop that already has the work in progress in its latest state instead of ensuring everything is synced across the different devices one uses. I also imagine that beefy MBPs with an M3 and upwards could handle 4K editing of Severance (or maybe 8K), and they'd edit on local machines if that were actually more convenient than connecting to a remote desktop. It's a bit shameful to admit, but still something we have to deal with while having such crazy advances in technology.

jiggawatts

In principle a good editing tool could use Git for the edit operations (mere kilobytes!) and use multi-resolution video that can be streamed and cached locally on demand.
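To put numbers on "mere kilobytes": an edit decision list is just a handful of structured records per cut, exactly the kind of thing git diffs well. A hypothetical sketch (the field names and timecodes are invented for illustration, not any real NLE's format):

```python
import json

# A toy edit decision list: one record per cut on the timeline.
edl = [
    {"clip": "A001_C002", "src_in": "00:01:10:05", "src_out": "00:01:14:20", "track": "V1"},
    {"clip": "A003_C007", "src_in": "00:00:02:00", "src_out": "00:00:05:12", "track": "V1"},
    {"clip": "MX_THEME",  "src_in": "00:00:00:00", "src_out": "00:00:30:00", "track": "A1"},
]

payload = json.dumps(edl, indent=2).encode()
# Roughly 100-150 bytes per cut, so even a feature film with
# thousands of cuts stays well under a megabyte of versioned text.
print(f"{len(edl)} cuts, {len(payload)} bytes")
```

The heavy media would live outside the repo; only these small, diffable records (plus proxy references) would be committed.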

Uehreka

When I got into projection design I tried using git to keep track of my VFX workspace. After typing `git init` I heard a sharp knock at my apartment door. I opened it to find an exhausted man shaking his head. He said one word, “No.” and then walked away.

Undeterred by this ominous warning, I proceeded to create the git repo anyway and my computer immediately exploded. I have since learned that this was actually the best possible outcome of this reckless action.

impoppy

All jokes aside, it's too big of a pain in the ass to have that stuff version controlled. Those file formats weren't meant to be version controlled. If there's persistent Ctrl-Z that's good enough and that's the only thing non technical people expect to have. Software should be empathetic and the most empathetic way to have the project available everywhere is either give people a remote machine they can connect to or somehow share the same editor state across all machines without any extra steps.