The Short Version
- Every delivery platform has a content license buried in its Terms of Service — most PROs have never read theirs
- Some platforms use uploaded content for AI training, and the license survives even after you delete your files
- Some guest-facing tools collect biometric data (faceprints) from wedding photos without explicit consent
- Wedding Memory claims no rights on any content — PRO work, couple uploads, and guest photos are never used beyond displaying them in the app
Every platform you upload to has a Terms of Service. It is several thousand words long, last updated on a date you do not remember, and somewhere in it there is a sentence about what that platform is allowed to do with the content you store on it.
Most people click through it. I did too, for a long time.
What a content license actually says
The reasonable version — the one every platform includes — reads roughly like this: by uploading content, you grant us a license to host it, serve it to your clients, and display it in a browser. That part makes sense. They need those rights to operate the service.
The version that is harder to justify is a secondary clause that follows the first — not the one at the top where they describe the service, but a separate paragraph buried further down. At least one major gallery platform used by wedding photographers grants itself rights to use uploaded content for any purpose, royalty-free, in perpetuity. The phrase "for any purpose" does a lot of work. It is not specific to running the service. It has no carve-out for AI training. It does not expire.
Most users click through terms like these without noticing. That platform's users were no exception.
The AI training clause
Some platforms are more direct about it. A handful — mostly the ones built around editing tools and AI culling — explicitly state in their terms that photos uploaded to the platform are used to train their AI models. One of them goes further: it acknowledges that this training persists even after you delete your account. You can close the account. You can delete every photo. The model that trained on them stays, and the influence does not get rolled back.
That is a defensible product decision if you know about it and agreed to it. Most people did not know about it.
The photos in question are not stock images or test uploads. They are your clients' wedding day — the ceremony, the getting ready, the first dance, the grandmother crying in the back row. Those images, sitting in the platform's storage, may be doing work you never intended for them.
Biometric data at the wedding
This is the one that I keep coming back to.
There is a category of event photo apps — not the delivery platforms, but the guest-facing ones — that have built their experience around facial recognition. The mechanic works like this: guests submit a selfie when they arrive. The AI scans all the event photos, matches faces, and sends each guest a personalized gallery of photos where they appear.
It is genuinely useful. It solves a real problem — nobody wants to scroll through 800 photos to find the twelve that include them.
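Under the hood, systems like this typically reduce each face to a numeric embedding — a fixed-length vector derived from the image — and match faces by comparing vectors. That vector is the "faceprint" the privacy policies are describing. The sketch below is illustrative only: the function names, toy vectors, and threshold are assumptions for the example, not any specific app's implementation.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two face embeddings (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def match_guest(selfie_embedding, photo_embeddings, threshold=0.8):
    """Return indices of event photos whose face embedding matches the
    guest's selfie embedding. Each embedding is the vector a face-
    recognition model derives from an image — i.e., the faceprint."""
    return [i for i, emb in enumerate(photo_embeddings)
            if cosine_similarity(selfie_embedding, emb) >= threshold]

# Toy vectors standing in for real model output (illustrative only):
selfie = [0.9, 0.1, 0.2]
photos = [[0.88, 0.12, 0.21],   # close to the selfie: same guest
          [-0.5, 0.7, 0.1]]     # far from the selfie: different guest
print(match_guest(selfie, photos))  # -> [0]
```

The point of the sketch is what it implies: to run this matching at all, the service has to compute and store a faceprint for every face in every photo — which is exactly the biometric data the next section is about.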
What it also does is collect biometric data — faceprints — from every guest at the wedding. At least one app in this category explicitly lists "biometric identifiers and biometric information including faceprints" in its privacy policy as data it collects and retains. A separate major platform — not a wedding-specific one, but one widely used for personal photos — settled a $100 million class action lawsuit related to biometric data collection from uploaded images.
The guests at that wedding did not read a privacy policy. They were handed a card with a QR code, scanned it, and submitted a selfie because it seemed like a fun thing to do at a wedding. They had no idea they were handing over biometric data to a company they had never heard of, stored for a retention period they were never told, under terms they never saw.
Neither did the couple who chose the app. Neither did the photographer who recommended it.
I use AI constantly
I want to be clear about where I stand, because this can read as an argument against AI and it is not.
I use AI every day — for writing, for development, for thinking through hard problems. The tools are genuinely good and I am not pretending otherwise. The team that built this platform uses AI throughout the process.
The argument I am making is narrower than "AI is bad." The argument is: the people in those wedding photos — your clients, their guests, their families — did not consent to being training data. They consented to being in a wedding gallery. Those are not the same thing.
One is a memory. The other is a resource.
Where we land on this
Wedding Memory's position on content rights is in the legal documents, not in marketing copy. We claim no rights over the content stored on the platform — not the professional films, not the guest photos, not the guestbook messages. Nothing is licensed for any purpose beyond displaying it in the wedding experience. No biometric data is collected from guests. Nothing trains a model.
It is written that way because we believe it. And because we know what the alternative looks like — we read the same terms you should probably read before you upload your next wedding.
If you want to compare terms, the legal documents for Wedding Memory are at wedding-memory.com/legal. They are shorter than most, and written in plain language. That was intentional.