Evolution 2 was pretty good. Much better than the first game.


Only if the site they’re visiting isn’t using HSTS, but it’s possible
As much as I dislike UniFi, it is true that there isn’t really a competitor in their tier of equipment. Upgrading would be a good choice, but if you end up choosing to stay with UniFi, I’d highly recommend the UCGs.
The UDR and UDM line are so horrifically poorly designed it’s frankly astonishing they ever left the drawing board. Not only is the shape unwieldy and awkward on its own, the thermals on those things are also terrible. I used to wake up to no internet every few weeks because my UDM would overheat and refuse to boot until it cooled down. I learned to just stick it in the fridge for a bit so it could boot again.
The UniFi UCG Ultra and Max are far better machines, capable of doing just as much if not more than the pill-shaped routers while staying cool and not sounding like a jet engine every time you send a text over wifi. The only thing you even lose is the integrated access point, which sucks compared to the discrete ones anyway.
This will happen automatically if you buy enough and start experiencing their true hardware failure rate lol.


Ngl that is a stupid ass name lol


You can definitely turn Apple Intelligence off


A separate box with apps that work better and just use the one remote.


Yeah “changing”, sure. But “evolving” implies things are changing for the “better”, which they are not.


Reading between the lines, I feel like when you say “Targeted towards self hosters” what you mean is “John Q Hobbyist who doesn’t know any better”
And in response to that I would contend that Gitea is not actually targeted at those folks, though they obviously use it. Gitea is FOSS but it’s still “targeted” at professionals.


It was not


Yeah but Umami is an analytics engine powered by client side tracking. If it was behind a VPN it would be useless.


I don’t know about “all umami instances being infected” but they were certainly all vulnerable.


I’m so glad my medieval-loving brethren are finally getting their long-awaited game.
But the bad news for me is this probably means yet another decade of trying to finish a game of Empire before the crashing brings my campaigns to a halt lol


If random encounters go down as CPUs get faster, my CPU is so much faster than one from the 90s that my random encounters should approach zero, but I had plenty.
I mean, some napkin math and averages would tell me that your base clock speed is roughly 8 times faster than the fastest computers they would have tested on. Is 8 times faster truly enough to bring the random event rate to “near zero”? Probably not. And with an old game like this it’s not as easy as just comparing clock speeds, because it depends on which CPU you have: do you have E-cores? If so, is your computer scheduling the game on those or your P-cores? And in either case, is it using the base clock speed or the boost clock speed? How do your drivers fit into all this?
There’s also the fact that while the encounter rate is tied to CPU speed, it’s not a 1:1 relationship either. The encounter system also factors in tiles traveled and in-game days.
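To put the idea in concrete terms, here’s a toy sketch (assumptions only, not Fallout’s actual code: the 200/500/4000 MHz figures, the linear scaling with clock speed, and the fixed per-tick roll chance are all made up for illustration) of why tying travel speed to the clock while rolling encounters at a fixed real-time rate produces fewer encounters on faster machines:

```python
import random

def encounters_for_trip(cpu_mhz, trip_tiles=1000, base_mhz=200,
                        roll_chance=0.05, seed=0):
    """Toy model, NOT Fallout's actual code: the party crosses more map
    tiles per real-time tick on a faster clock, but the encounter roll
    only fires once per tick, so faster CPUs get fewer rolls per trip."""
    rng = random.Random(seed)
    tiles_per_tick = cpu_mhz / base_mhz  # assumption: travel speed scales linearly with clock
    tiles_left, encounters = trip_tiles, 0
    while tiles_left > 0:
        tiles_left -= tiles_per_tick     # travel happens every tick regardless of clock
        if rng.random() < roll_chance:   # ...but the roll rate stays fixed per tick
            encounters += 1
    return encounters

for mhz in (200, 500, 4000):
    print(f"{mhz} MHz -> {encounters_for_trip(mhz)} encounters over the same trip")
```

On these made-up numbers, the 200 MHz machine gets dozens of rolls across the trip while the 4 GHz machine skips twenty tiles between rolls, which is exactly the kind of mismatch the fan patches mentioned further down compensate for.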
that they built and tested the game on higher end machines than many of their customers had, and that faster CPUs resulted in the correct encounter rate while slower CPUs resulted in dozens.
Like I’ve already said, they accounted for lower CPU clocks at the time. They designed the encounter rate for clock speeds between 200 MHz and 450-500 MHz, the whole range at the time. You’re also acting like Fallout 1 wasn’t a cheap side project half made for free by people working outside company hours. It wasn’t some big-budget release. Or as if Fallout 2 wasn’t an incredibly rushed game shoved out the door by a financially failing company.
I’d sooner believe that the game working differently at different clock rates was an oversight rather than how they intended for it to work.
It was neither. It was simply an engine limitation they had to account for as best they could, because the first two games were functionally just tabletop RPGs under the hood that ran on a modified version of GURPS and relied on dice rolls for practically everything. As with anything else in life, they designed around the problems they encountered at the time, not some hypothetical distant-future scenario they’d have no way to predict.


cap on encounter rates, why do they all appear to be at about the rate I experienced?
Well it’s clearly not a cap if you’re seeing people having more frequent encounters than you are.
And why would we not assume that that cap was the intended design?
Because they tied the encounter system to CPU frequency, and the highest consumer CPU frequency at the time was like 500 MHz. Why on earth would you assume that the developers designed the rate not around what hardware was capable of at the time, but around what it would be capable of 15 years later?
You’re suggesting that the developers got into a room together and said “Let’s design this so that it won’t play the way we intend for it to be played until 15 years pass”


If we ignore the part where that person had so many encounters that they came to the conclusion that something was wrong
I wouldn’t ignore it at all. In fact, what they might even be experiencing is the game’s intended encounter rate, which, as I told you, is much higher than you think it is. A lot of modern Intel CPUs, especially in laptops, have efficiency cores alongside their performance cores and sometimes have insanely low base clock speeds, we’re talking as low as 200 MHz. Given the game’s age, it’s very possible the game was scheduled on an E-core and also wouldn’t boost the clock speed, resulting in the behavior they describe.
if we ignore the distinct possibility that people remembering a higher encounter rate could have been experiencing that due to their CPU spec not being what the developer intended even in the 90s
That’s not a possibility. The developers specifically designed the system with lower-spec systems of the time in mind. They actually designed it in such a way that the encounter rate would be reasonable compared to their ideal rate on systems with clock speeds as low as 200 MHz (just like our friend above).
Now that user will be experiencing more encounters than even the average player in the ’90s, but it still wouldn’t be outside the realm of what the devs decided was intended.


Nope, the opposite. From your casual search:
playing unpatched vanilla Fallout 2 will likely REDUCE the number of random encounters (and the time you spend on the map screen, lic) because the game originally tied the travel rate to your hardware.
There’s a reason why most fan restoration patches include logic to increase the number of encounters, to make the game play more like it was when released.
The reason is that they tied the travel system to clock speeds, and modern processors make your travel speed too fast, which the random encounter timing system doesn’t account for. People were complaining about this 15 years ago, and the problem has only gotten worse since then.
The GOG versions do not include any fixes for the encounter system.


Your doubt isn’t a factor; it’s just how the game works. Unless both 10 years ago and 1 year ago you replayed them on a computer from the late ’90s, you didn’t get as many random events as were intended. The very fact that you think random events were such a small part of those games also confirms you weren’t getting as many as you were supposed to lol.
It is not, and as somebody who was patient with Civ VI and ultimately loved it after it was fleshed out, I don’t think it ever will be. The “play three different civs over the course of each game with a leader unrelated to any of them” thing they stole from Humankind is not going away, so if you’re not a fan of that you’re just going to have to skip this one.