• 0 Posts
  • 19 Comments
Joined 11 months ago
Cake day: August 4th, 2023


  • As an introvert, as much as I feel weird around people, I feel even weirder video chatting with people I’ve never met in person. In that situation, I have no idea how to read people, and the expectations are much harder to meet. That makes meetings even worse, at least until I’ve met the other people in person.

    While I agree that forced daily in-person work is insane, the OP is complaining about meeting people in person once after many years, which feels equally ridiculous. IMO, even for widely dispersed teams, meeting a few times a year seems ideal.



  • I’m actually shocked to find how many people agree with the OP’s sentiment, but maybe that says something about the demographics of who’s using a FOSS Reddit alternative. I’m not saying everyone is wrong or that something is wrong with them, but I entirely agree with the people who find this valuable, so maybe I can answer the OP’s question here.

    I’ve been working remotely since long before the pandemic, for multiple companies and in different environments. I am extremely introverted and arguably antisocial; I tend to prefer hanging out with many of my friends online rather than in person. But that doesn’t mean I think there’s no advantage at all. To be honest, when I first started remote work, I thought the in-person thing was total bullshit. After a few meetups, my opinion changed drastically.

    I’ve pushed (along with other employees, of course) to get remote employees flown in at least a few times a year at multiple companies. The social dynamics in person are vastly different from those over video. Honestly, I don’t understand how people can feel otherwise, especially once they’ve experienced it. I’ve worked with many remote employees over the years and asked about this, and most have agreed with me. Many of them are also introverted.

    I think one of the big things here is people harping on the “face” thing. Humans communicate in large part through body language; it’s not just faces. There’s also a lot of communication in microexpressions that compressed, badly lit video doesn’t always capture. So much of communication simply doesn’t survive video.

    Secondly, in my experience, online meetings are extremely transactional: you meet at the scheduled time, you talk about the thing, then you close the meeting and move on. In person, people slowly mosey over to meetings, and after a meeting ends they tend to hang around a bit and chat. When you’re working in an office, you grab lunch with people, or bump into them by the kitchen. There’s a TON more socializing in person, where you actually run into other people and talk to them as people, not just as cogs in the machine that gets your work done.

    I find in-person interactions drastically change my relationships with people. Some people come off entirely differently online, and it’s not until meeting them in person that I really feel like I know them. After that, I understand their issues, blockers, and miscommunications better, and I’m more sympathetic to their experiences.

    Maybe things are different if you work jobs with fewer interdependencies, or that are more solo. I’ve always worked jobs that require a lot of cooperation among people in different roles, and those relationships are just way more functional with people I’ve met and have a real relationship with. That comes from things that just don’t happen online.

    I’m honestly really curious how anyone could feel differently. The other comments just seem mad about being required to attend, and claim the same stuff happens online, but it just doesn’t. I do wonder if it has to do with being younger and entering the workplace more online. But I’ve worked with hundreds of remote employees and never heard a single one call the in-person stuff useless, and I’ve heard many say exactly the opposite.



  • Not to mention, Amazon already owns multiple online video services in Prime Video and Twitch. Between them, those two already cover a bunch of those bases, so you’re talking about standing up an entirely new unprofitable service that needs its own monolithic infrastructure and will end up competing with your own services, all to try to take a third slice away from YouTube.

    It’s just nowhere near worth it. If you think Amazon has any business competing with YouTube, you don’t understand (a) how the market works, (b) how much of a technical undertaking that is, (c) how much lift it would take to get a reasonable number of creators keeping the platform active, or (d) how financially unviable the product is. Any one of those alone would be a serious deterrent, and there are many more reasons not to do this.



  • So, I gather that what happened was iPhones and changes to coding languages (HTML5) which didn’t require an extra on the system (a plug-in) to do its thing.

    … Sort of. That’s a bit of an oversimplification and a bit iPhone-centric, but generally the right idea.

    I’d slightly shift this and say it’s more that Flash and Java applets had many known problems and were janky workarounds for the limits of the HTML of the day. Browsers continued to support them because they were needed for certain tasks beyond games that were actually important; games were just a secondary thing allowed to exist because the tech was there for other problems.

    At the time, more “serious” games were mostly local installs outside your browser, and browser games were more “casual,” aimed at the less technically inclined general audience. The main exceptions were RuneScape and a couple of others like Wizard101.

    But then smartphones started becoming more popular, and they just could not run Flash/Java effectively. The plugins were inefficient, smartphone hardware was well behind desktops, and it just didn’t work well. In the early days, many Android phones could run bits of Flash/Java, sometimes requiring custom browsers, but it was never very performant.

    Then HTML5 came along, solved most of the gaps in existing HTML tech, and the need for Flash and Java greatly decreased. Because of the performance problems and security vulnerabilities, the industry as a whole basically gave up on them: with the functional shortcomings covered, there was no need for them beyond supporting games. HTML5 did somewhat support the same game tech, but getting back there would take massive rewrites, and there was basically no tooling. Adobe had spent over a decade building Flash tools, and developers were being dumped onto lower-level tech with zero years of tooling development. Then came WebGL and some other tech, but nothing ever got a real grip on the market.
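    To make the “same game tech” point concrete, here’s a minimal sketch of the kind of plugin-free game loop HTML5’s canvas enabled. The element id and drawing details are made up for illustration; nothing here needs a Flash or Java runtime.

    ```typescript
    // Tiny canvas “game”: one square bouncing across the screen,
    // using only browser-native APIs (hypothetical element id "game").
    const canvas = document.getElementById("game") as HTMLCanvasElement;
    const ctx = canvas.getContext("2d")!;

    let x = 0;
    let dx = 2;

    function frame(): void {
      ctx.clearRect(0, 0, canvas.width, canvas.height);
      ctx.fillRect(x, canvas.height / 2 - 10, 20, 20);
      x += dx;
      if (x < 0 || x + 20 > canvas.width) dx = -dx; // bounce at the edges
      requestAnimationFrame(frame); // built-in frame scheduling
    }
    requestAnimationFrame(frame);
    ```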

    Unity and some other engines made it easier to compile to HTML5 and WebGL over the years, so this was definitely still possible, but by then interest was plummeting, so there wasn’t much point.

    Much of the popularity of web-based games back in the day was that you could just tell someone a URL and they could go play it on their home computer. The allure was accessibility, not the tech. For high-tech games, standalone desktop titles won out, but those were harder to get: finding them, going to a store, making a purchase, bringing a CD home, installing the game, having the hardware to run it, etc.

    But by the time Flash and Java died, everyone carried a smartphone. They all had app stores: search once, install the game, and it’s easily accessible on your device, running at native performance. Console gaming had become commonplace. PC gaming was fairly common, with pre-built gaming PCs readily available. Steam existed, so you didn’t have to go to a store or understand install processes. Every tech competing with web games was far more accessible, and smartphones better covered “gaming for the general populace.”

    What would be the point of a web game at that point? Fewer people have desktops, so your market is smaller. If you’re aiming for people’s smartphones, building natively for the two platforms is higher performance and easier to deal with. Console gaming is more common. PC gaming is a stable market. On top of that, there’s way less money in web-based gaming. Steam and the console stores come with the expectation of spending money and an easy way to do so. Smartphones have native IAP support to make microtransactions painless. The web has… entering your payment information into whatever payment processor that website integrated, which feels less safe to the user and requires more work from the developer than the console/PC/mobile alternatives.

    There’s just no market for web-based gaming anymore when people have so many easier options available. What’s the purpose of building a web-based game at that point?



  • Musk said he made the decision fearing that Moscow would retaliate with nuclear weapons.

    I feel like this part is even worse. His opinion sucks and is fucking stupid, but he’s literally saying he made decisions (which impact thousands of lives) based on his own speculation about the Russian response.

    He’s not a fucking general; this shit shouldn’t be his decision. He is not informed or educated enough to be making these calls, and he’s playing with people’s literal lives. He’s literally trying to play god with his space toys.


  • they have made a pretty good effort to patch Pegasus vulnerabilities whenever they come about,

    I mean, they kind of have to? What’s the alternative, leaving it unpatched? Why are we applauding them for basically the bare minimum here?

    Apple’s investment in discovering these problems seems pretty poor. There are multiple instances of Google finding exploits for them, and then Apple downplaying the findings and complaining that Google is being too alarmist.

    Sure, they fix things. But they fucking better, or there’s a very different problem. Their proactive investment in discovering these issues ahead of time, though, seems pathetic.


  • Yeah, the argument that there’s money in this business only furthers the point: there’s money in it because abusing these systems is valuable. Therefore, the people running those systems should be the ones fucking funding the research, and then using that arrangement to keep exploit details behind closed doors until they’re able to ship a fix.

    It’s almost like this should be an entire internal department. Maybe it could be named after the idea of keeping things secure?

    If the company making massive profits off the sales of these devices isn’t going to fund it, who is? It’s fucking insane to me that Google basically funds the security of iOS for Apple, its direct competitor in that market. We probably wouldn’t even know these exploits exist if it weren’t for work like that.


  • because In real life, when users see a huge performance drop, they complain

    Yeah, true, and the dead people don’t get to complain, so just prioritize performance because the dead aren’t complaining.

    /s obviously. I don’t give a fuck how much performance you gain or lose by running an exposed system. Raising road speed limits would help people get to work faster, but more of them would be dead. Road safety comes first; convenience and speed come second.

    I could understand people having a slightly different priority list 30 years ago, when performance was shit and computers were obscure. But in this day and age, we’re talking about performance gains 99.9% of the populace won’t notice, and computers literally run our lives. The priority is security.

    then you have situations like Intel’s Downfall, which has sizable AVX2/AVX512 performance penalties.

    Yeah, exactly. Most people don’t use AVX all that much, and those who do likely have newer machines that are unaffected. And Intel is patching it.
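    If you want to check which bucket your own machine falls into, here’s a minimal sketch, assuming Linux (recent kernels report Downfall as “gather_data_sampling” in sysfs; the script itself is just illustrative):

    ```typescript
    // Read the kernel’s verdict on Downfall (Gather Data Sampling).
    // Assumes Linux with a kernel new enough to know about the bug.
    import { readFileSync } from "node:fs";

    const path = "/sys/devices/system/cpu/vulnerabilities/gather_data_sampling";
    try {
      const status = readFileSync(path, "utf8").trim();
      // Typical values: "Not affected", "Mitigation: Microcode",
      // or "Vulnerable: No microcode".
      console.log(`Downfall status: ${status}`);
    } catch {
      console.log("No GDS entry: kernel too old, or not Linux.");
    }
    ```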