Adobe and Editors' Content: What's Mine Is Mine, and I Mean It

As I write this, Adobe is navigating some interesting content ownership and usage issues.

On June 5, 2024, users of various Adobe apps were faced with a pop-up notice stating, "we may access your content through both manual and automated methods, such as for content review." The ONLY way forward was to agree to it. There was no opt-out; it was simply assumed that your content was included. That's the wrong way to do things.

You were lucky to be told at all because, as Mashable noted: "The updated section in Adobe's Terms of Service, actually, quietly went into effect all the way back on February 17, 2024. It says, "Our automated systems may analyze your Content and Creative Cloud Customer Fonts (defined in section 3.10...) using techniques such as machine learning in order to improve our Services and Software and the user experience." The language is deliberately vague. But the specific mention of "your content," and using "machine learning in order to improve our Services and Software," immediately drew concerns that users' creative work would be used as training data for Adobe's AI tools."

http://mashable.com/article/adobe-users-outaged-new-policy-trains-ai-their-work

The worst part, to me, is Adobe's assumption that any and all user content can be used by Adobe without credit or compensation, whether for training or whatever other reason they want. Where was the big "caution" sign asking if this was okay in the first place? It really looks like everyone was quietly included, and you have to hunt down the setting and manually opt out, which is absolutely the wrong way to do things.

Grummz on X noted: "Summary of rights Adobe is asserting over YOUR work:
1. They can review it for Content Moderation
2. They can manually review it (humans)
3. They can publish or give it away for free
4. They can sell it to 3rd parties (sublicense)
5. They can and will use it to train their AI
You can't fully opt-out of it even if files are local, like for content aware fill and other non-specified tools."

http://twitter.com/Grummz/status/1798721085556560222

It's one thing to use Facebook knowing that whatever you willingly post, Facebook will use to keep others engaged on the platform so it can sell ads. It's another thing entirely for Adobe to charge users for a tool we long considered private, only for us to realize they've slowly been shifting the ownership legalese and policies to include Adobe's ability to use your content, without compensation, for any reason.

Some have postulated that it's only content you upload to their servers (which should still be private; it shouldn't belong to Adobe, Dropbox, iCloud, etc.) and also noted that you can "Opt Out" of content analysis. But Adobe's legalese actually says, "Turning off content analysis doesn't affect our ability to analyze your content when you participate in programs ... If you don't want your content to be used for such purposes, you should avoid participating in those programs, including, but not limited to..." Adobe then lists four programs, but admits it's "NOT limited to" those four. Also, "if you use features that rely on content analysis techniques (for example, Content-Aware Fill in Photoshop), your content [will] still be analyzed when you use those features..."

http://helpx.adobe.com/manage-account/using/machine-learning-faq.html

The online uproar from users led Scott Belsky, Adobe’s Chief Strategy Officer and Executive Vice President of Design and Emerging Products, and Dana Rao, Adobe’s Executive Vice President, General Counsel, and Chief Trust Officer, to post a response on the Adobe blog on June 10th, saying in part, "We recently rolled out a re-acceptance of our Terms of Use which has led to concerns about what these terms are and what they mean to our customers. This has caused us to reflect on the language we use in our Terms, and the opportunity we have to be clearer and address the concerns raised by the community."

http://blog.adobe.com/en/publish/2024/06/10/updating-adobes-terms-of-use

Those are the two people who should have had that ironed out BEFORE rolling out the "re-acceptance of Adobe terms." (Maybe they did!) In fact, while promising that "We’ve never trained generative AI on customer content, taken ownership of a customer’s work, or allowed access to customer content beyond legal requirements," it raises the question: you say you haven't done it, so why did you deliberately craft the legal Terms of Use to leave so much grey area? If Adobe truly plans to honor this, why doesn't the legalese match? If it can be said so clearly in a blog post, why didn't the Terms of Use say it that way in the first place?

Why isn't there a big notice describing the uses and asking us to opt in to any potential company use of our content? Even Generative Fill should let us know, each time! Why do we have to find an article to track down a preference to "opt out" when we never opted in in the first place? It's because companies are already stepping over the content ownership line and hoping you won't notice.

I feel this is just a harbinger of things to come. All sorts of companies are desperate for content they can leverage for their engines and tools. In many cases they have to pay to license images or absorb huge libraries of public content, sometimes still getting caught incorporating copyrighted work. But Adobe has found a new solution, and I'm sure others will follow: not only do you pay to use their tools, but their actual Terms of Use establish their right to use your content, without compensation, regardless of what they post on a blog.

As for me, I'm a long-time Premiere user (versions 2, 3, and 4 back in the '90s). I'm now going to learn a new non-linear edit tool, like ShotCut, Resolve, Vegas Pro, Lightworks, Final Cut Pro, Media Composer, etc. There's no shortage of alternative tools that (currently) don't claim ownership of your work for their needs. However, I'll need to pay attention, because that other tool might be the next one after my content. I'm sure this won't be over soon, as each of these tools needs to keep pace with the others. We'll all need to be extremely vigilant about all our tools and services from now on if we want to keep what's ours, just ours.
