Has this impacted your self-hosted instances of Immich? Are you hosting Immich via a subdomain?
Related:
Google is dangerous.
Deadly to their margins by 0.000000000000000000000000000000000001%
Was also flagged recently.
In my case it was the root domain, which is:
- Geofiltered to only my own country in Cloudflare
- Geofiltered to only my country in my firewall
- Protected by Authelia (except the root domain, which returns a 404 when accessed)

So… IDK what they want from me :p My domain doesn't serve public websites (like a blog) destined for public consumption…
Is it available for the public to get to? Yes, so that’s why.
Stop using google. Don’t you know their motto? “Be evil”
OP is impacted by Google Safe Browsing, whose blocklists various browsers use.
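If you want to check whether your own domain is on the list, Google's Safe Browsing v4 Lookup API will tell you. A rough sketch in Python (the API key and the URL being checked are placeholders; you'd create a key in Google Cloud Console):

```python
import requests

API_KEY = "YOUR_API_KEY"  # placeholder
ENDPOINT = f"https://safebrowsing.googleapis.com/v4/threatMatches:find?key={API_KEY}"

# Ask the Safe Browsing v4 Lookup API whether a URL is on any threat list.
body = {
    "client": {"clientId": "selfhost-check", "clientVersion": "1.0"},
    "threatInfo": {
        "threatTypes": ["MALWARE", "SOCIAL_ENGINEERING", "UNWANTED_SOFTWARE"],
        "platformTypes": ["ANY_PLATFORM"],
        "threatEntryTypes": ["URL"],
        "threatEntries": [{"url": "https://photos.example.com/"}],  # placeholder
    },
}

resp = requests.post(ENDPOINT, json=body, timeout=10)
resp.raise_for_status()
# An empty JSON object means "no match"; otherwise "matches" lists the hits.
print(resp.json().get("matches", "no matches"))
```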
Easier said than done if your end users run Chrome, because Chrome will automatically block your site if you're on double secret probation.
The phishing flag usually happens because you have the Username, Password, Log In, and SSO button all on the same screen. Google wants you to have the Username field, the Log In button, and any SSO stuff on one page. Then if you input a username and go to start a password login, Google expects the SSO to disappear and be replaced by the vanilla Log In button. If you simply have all of the fields and buttons on one page, Google flags it as a phishing attempt. Like I guess they expect you to try and steal users’ Google passwords if you have a password field on the same page as a “Sign in with Google” button.
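To make that concrete, here's a toy sketch of the two-step flow in Python/Flask — hypothetical routes and markup, not anything Immich actually ships:

```python
from flask import Flask, request, render_template_string

app = Flask(__name__)

# Step 1: ask only for the username, with SSO offered alongside it.
STEP_ONE = """
<form method="post" action="/password">
  <input name="username" placeholder="Username">
  <button type="submit">Log In</button>
</form>
<a href="/oauth/google">Sign in with Google</a>
"""

# Step 2: once a username is submitted, show ONLY the password field --
# the SSO button is gone, which is the flow described above.
STEP_TWO = """
<form method="post" action="/login">
  <input type="hidden" name="username" value="{{ username }}">
  <input type="password" name="password" placeholder="Password">
  <button type="submit">Log In</button>
</form>
"""

@app.get("/")
def step_one():
    return render_template_string(STEP_ONE)

@app.post("/password")
def step_two():
    return render_template_string(STEP_TWO, username=request.form["username"])
```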
Firefox ingests Google SafeBrowsing lists.
If you are falsely flagged as phishing (like I was), then you are fucked regardless of what you use (unless you use curl). I couldn't even bypass the Safe Browsing warning in Firefox on my Android phone.
Google, protecting you from privacy
Google protecting Google from FOSS.
They’re right too, after using Immich I don’t want to go back.
Also protecting you from the East Wing of the White House.
Why are the Immich team's internal deployments available to anyone on the open web? If you go to one of the links they provide in the article, it has an invalid SSL certificate, which Google rightly flags as a security risk, warns you about, and stops you from visiting without manual intervention. This is standard behaviour, and no one should want Google to stop doing this.
I was going to install Linux on an old NUC to run Immich sometime soon, but now I think I'll have to check whether it has been audited by some legit security companies first. How do they not see this issue of their own making?
It is for pull requests. A user makes a change to the documentation; they want to be able to see the changes on a web page.
If you don’t have them on the open web, developers and pull request authors can’t see the previews.
The issue they had was being marked as phishing, not the SSL certificate warning page.
> It is for pull requests. A user makes a change to the documentation; they want to be able to see the changes on a web page.
So? What does that have to do with SSL certificates? Do you think GitHub loses SSL when you view PRs?
> If you don’t have them on the open web, developers and pull request authors can’t see the previews.
You can have them in the open, but without SSL you can’t be sure what you’re accessing, i.e. it’s trivial to make a malicious site take its place and MitM whoever tries to access the real one.
> The issue they had was being marked as phishing, not the SSL certificate warning page.
Yes, a website without SSL is very likely a phishing attack; it means someone might be impersonating the real website, so it shouldn’t be trusted. Even if by a fluke of chance you hit the right site, all of your communication with it is unencrypted, so anyone in the path can see it in the clear.
> Yes, a website without SSL is very likely a phishing attack; it means someone might be impersonating the real website, so it shouldn’t be trusted. Even if by a fluke of chance you hit the right site, all of your communication with it is unencrypted, so anyone in the path can see it in the clear.
No, Google has hit me with this multiple times for subdomains where the subdomain is the name of the product and has a login page.
So, for example, if I have Emby running at emby.domain.com, they’ll mark it as a phishing site. You have to add your domain to their web console and dispute the finding, which is probably automated. I’ve had to do this at least three times now.
All my certs were valid.
Yes, Google has misreported my websites in the past, all of which had valid certs, but the person I’m replying to seemed to assume no SSL is a requirement of the feature, and he doesn’t understand that a wrong/missing SSL cert is indistinguishable from a phishing attack, and that the SSL error page is the one that warns you about phishing (with reason).
> The issue they had was being marked as phishing, not the SSL certificate warning page.
Have you seen what browsers say when you have a look at the SSL certificate warning page?
> It is for pull requests. A user makes a change to the documentation; they want to be able to see the changes on a web page.
Why is a user-made PR publishing a branch to Immich’s domain for the user to see?
I thought that was how pull requests worked: it’s a branch you’ve made to edit code, you open the pull request and ask them to merge it into the main branch. It should be visible to everyone so everyone can review the change.
You could just host it inside your network and use an always-on VPN. That’s what I do.
Now imagine you’re running a successful open source project developed in the open, where it’s expected that people outside your core team review and comment on changes.
How would that work? The use case is previews for pull requests. Somebody submits a change to the website, which creates a preview domain where reviewers and authors can see the proposed changes in a clean environment.
Cloudflare Pages gives this behavior out of the box.
Ah, I missed that part
Immich users flag Google sites as dangerous
I smell fear.
I knew it was too good to be true when they gave away free pic storage with their Pixel phones. I just didn’t listen to my gut.
Google
I have identified the problem.
The URLs mentioned in their blog article all serve the wrong certificate (different host name).
I’m sure that if they fix it, Google’s system will reclassify the sites as safe.
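If you want to reproduce what the browser (and presumably Google’s scanner) sees, Python’s ssl module runs the same host-name check a browser does. A quick sketch (the host below is a placeholder):

```python
import socket
import ssl

# Connect and let Python's default verification run: a certificate issued
# for a different host name fails the same check the browser warning
# page reports.
def check_cert(host: str, port: int = 443) -> None:
    ctx = ssl.create_default_context()  # verifies the chain AND the host name
    try:
        with socket.create_connection((host, port), timeout=5) as sock:
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                subject = tls.getpeercert()["subject"]
                print(f"{host}: certificate OK, subject={subject}")
    except ssl.SSLCertVerificationError as err:
        print(f"{host}: verification failed: {err.verify_message}")

check_cert("photos.example.com")  # placeholder host
```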
Yeah, sure. Five years after Google flagged one of the sites I hosted, some firewalls (including ISP-level blocks) still mark the domain as unsafe. Google removed the block after more than a week, but the stink continues to this day.
It was also a development domain and we were forced to change it.
I think that marking things as “safe” could have more complications than this, depending on their definition, but I think you’re right: that’s probably all this issue is. This is almost the only sane comment here. Everyone else seems to be frothing at the mouth, and I’m guessing it’s a decent mix of not understanding much of how these systems work (and blindly running tutorials, for those that do self-host) and blind ideology (big companies are bad / any practice that restricts my personal freedom in any way is bad)
> any practice that restricts my personal freedom in any way is bad
Yes? I don’t want to live in a world where giant companies decide what I can and cannot see. And big companies are bad; they act as pseudo-governments that aren’t accountable to anyone. We used to break them apart before they started buying up politicians and political power.
Agreed after the yes.
I’m not sure how what you said either: justifies the comments not fitting that label; justifies that “any practice that restricts my personal freedom in any way is bad” is a practical ideology; or even establishes much of a link between what you’ve quoted and what you’ve said. And I think you need to be doing one of those to be making a counterargument.
I don’t blame people for thinking that something is off after reading the linked blog post. This wouldn’t be the first time Google does something like this to OSS that poses some kind of potential threat to their business model (this is also mentioned in the post).
They’ve also started warning against Android apps from outside repos. Basically they want to force people to use their AI-filled bullshit apps.
Jellyfin had a similar issue for a long time with servers exposed to the internet: Google would always re-block the domains soon after unblocking them. I think they solved it in the latest update. Basically, Google’s scraping bots think that all Jellyfin servers are a scam imitating a “real” website.
But the malvertisements on Google’s front page are ok, I guess
What is the use case for exposing Jellyfin to the open internet anyway?
Watching it remotely, like at a friend’s. Even if you can access it on your phone through a VPN, the smart TV won’t be able to use it.
What’s the use case for Netflix? Same case.
Google marks half the apps on my phone as dangerous. Google are evil xxxxxx’s
This is why you disable Google “safe browsing” in LibreWolf and use badblock instead.
LibreWolf has Google “safe browsing” to disable…? Google?
Firefox has Google Safe Browsing API protection; LibreWolf disables it by default under LibreWolf’s settings.
https://support.mozilla.org/en-US/kb/safe-browsing-firefox-focus
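For what it’s worth, those toggles map to real about:config prefs. If you’d rather script it than click through settings, a rough sketch that writes them into a profile’s user.js (the profile path is a guess; adjust to your setup):

```python
from pathlib import Path

# Hypothetical profile path; point this at your own Firefox/LibreWolf profile.
PROFILE = Path.home() / ".mozilla/firefox/xxxxxxxx.default-release"

# These are the actual about:config pref names behind Firefox's
# Safe Browsing integration; false turns the Google lookups off.
PREFS = {
    "browser.safebrowsing.phishing.enabled": False,
    "browser.safebrowsing.malware.enabled": False,
    "browser.safebrowsing.downloads.enabled": False,
}

lines = [f'user_pref("{name}", {str(value).lower()});' for name, value in PREFS.items()]
(PROFILE / "user.js").write_text("\n".join(lines) + "\n")
print("wrote", PROFILE / "user.js")
```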
Thanks for sharing this nice blocklist :)
Same when you try to deviate from the approved path of email providers or, dog forbid, even self-host email.
This is why I always switch off that “block potentially dangerous sites” setting in my browser - it means Google’s blacklists. This is how Google influences the web beyond its own products.
edit: it’s much more complex than simple blocklists with email
I wouldn’t recommend turning off Safe Browsing.
If a page is blocked it is very easy to bypass. However, the warning page will make you take a step back.
For instance, someone could create a fake Lemmy instance at fedit.org to harvest credentials.
Just use uBlock Origin and a proper password manager. Google Safe Browsing means Google sees what sites you browse.
@possiblylinux127 @A_norny_mousse ungoogled-chromium disables safe browsing, and for Debian’s chromium package I keep going back and forth about whether to pull that patch in or not.
@Andres4NY@social.ridetrans.it
Running Debian Stable, I have installed ungoogled-chromium, which is also in the repos.
But LibreWolf is my main browser; Chromium is a rarely used secondary.
What I’m talking about is how these blocklists are used by many other browsers and programs (e.g. Firefox) as well.
This is why I never use Chrome or Google Search.