• 1 Post
  • 57 Comments
Joined 1 year ago
Cake day: July 9th, 2023


  • COULD be a big deal, assuming a lot of “ifs” wind up coming to pass. Nebraska awards its electoral college votes on a piecemeal basis. Each of the 3 congressional districts gets 1 vote awarded to the winner of the popular vote in that district, and 2 “at-large” electoral votes are given to the overall winner of the statewide popular vote. This has only been relevant in two elections, 2008 and 2020, when the second district (which is basically just the Omaha metropolitan area) awarded 1 blue vote amid a sea of red. Now, the state Republican party (no doubt assisted by the national party) did their damnedest to try and make things Winner Take All to prevent this situation from occurring again, but were unable to court the votes necessary in the legislature before time ran out. In fact, all around town I see folks with signs in their yards bearing either a 🔵 to represent our district, or a silhouette of the state all in red, to represent the electoral voice of this district being silenced (probably not how they look at it, but my biases are what they are).

    I’m too far removed from electoral news to understand exactly how this all shakes out, but there is a possible path to the election being decided by a single electoral college vote, and the influx of 100,000 potential voters with a possible (I’m speculating, but I don’t think it’s unreasonable) blue bias in primarily CD-1 and CD-2 could help secure that vote.



  • I imagine it’s something of a difference in expected audience behavior. I would think that, for most people, looking at a few of the top comments and their replies is all the engagement with a post they want to have. So, a voting system facilitates that process by highlighting a few items the hive mind likes, and leaving the rest in relative obscurity. Whereas forum style posting sort of assumes that everyone present in a thread is in conversation with one another, hence chronological organization.



  • I see this response with some degree of frequency here on Lemmy (and Reddit before) when a movie/game bombs, or a show is cancelled, and I have to wonder how valid it is. Like, I would suspect that the population that uses Lemmy regularly and the population that takes steps to remove corporate advertising from their lives form an essentially circular Venn diagram.

    At a certain point, easy though it is to blame marketers for not getting the word out, folks need to acknowledge the fact that, when advertisers come knocking at their door, they’re turning off the porch light and closing the blinds.

    Which is not to imply that people have anything approaching an obligation to open themselves up to advertising. I’m just saying that blaming a lack of ads while running an ad blocker seems disingenuous.

    For the record, OP, not an attack on you or anything, just voicing some thoughts that have been percolating since reading about a couple high profile flops and cancellations this summer.


  • As neat and tidy as your explanation is, I think you are vastly oversimplifying the concept.

    You say the moon is real because you can see it, and you can prove it’s there by telling other people to just go look at it. Alrighty then, I’ve seen bigfoot. In fact, lots of people say they’ve seen bigfoot. Therefore he must exist too, right? The photos “prove” his existence just as much as you pointing to the sky saying the moon exists cause there it is.

    Now, I realize that there’s probably some degree of hyperbole in your statement, so I’ll walk this back a little. If the defining metric of your separation between these concepts is whether the hypothesis can be proven through experimentation, that’s all well and good. However, I would argue that, in 99.9% of cases, it’s still a belief statement. Let’s continue with the moon example, but, rather than “seeing is knowing”, let’s apply the same standard that you applied to God. So, you “know” the moon exists, not just because you can see it, but because it’s existence can be empirically proven through experimentation. What sort of experiments would you conduct to do that, exactly? Have you done those experiments? Or, like the rest of the rational world, do you accept that scientists have done those experiments already and decided, yup, moon’s there? Cause, if you’re taking someone else’s word for it, do you personally “know” what they are saying is true, or do you believe them based upon their credentials, the credentials of those who support the argument, and your own personal beliefs/knowledge?

    Now, I realize that there’s probably some degree of hyperbole in your statement, so I’ll walk this back a little. If the defining metric of your separation between these concepts is whether the hypothesis can be proven through experimentation, that’s all well and good. However, I would argue that, in 99.9% of cases, it’s still a belief statement. Let’s continue with the moon example, but, rather than “seeing is knowing”, let’s apply the same standard that you applied to God. So, you “know” the moon exists, not just because you can see it, but because its existence can be empirically proven through experimentation. What sort of experiments would you conduct to do that, exactly? Have you done those experiments? Or, like the rest of the rational world, do you accept that scientists have done those experiments already and decided, yup, moon’s there? Cause, if you’re taking someone else’s word for it, do you personally “know” what they are saying is true, or do you believe them based upon their credentials, the credentials of those who support the argument, and your own personal beliefs/knowledge?

    Now, anyone in this thread not born in 850 BC Athens will point out that it’s a flawed experiment, since I’m not controlling for air resistance, and if you conducted the same experiment in a vacuum chamber, both objects would fall at the same rate. However, the technology to test my hypothesis with all of the salient variables controlled did not exist at that time. So, even though it’s now widely known that my experiment was flawed, it wouldn’t have been at the time, and I would have the data to back up my theory. I could simply say try it yourself, it’s a self-evident fact.

    Finally, your statement about subjectivity of definition being an obstacle to functional language is so alarmist as to border on ridiculous. If this question were “how do you personally define the distinction between ‘yes’ and ‘no’”, then sure, I could get on board a little more with your point. However, this is much more like ‘twilight’ vs ‘dusk’. Crack open a dictionary and you’ll find that there is a stark, objective distinction between those terms, much as you pointed out that belief and knowledge have very different definitions. For the record, since I had to look it up to ensure I wasn’t telling tales here, sunset is the moment the sun finishes crossing the horizon, twilight is the period between sunset and dusk when light is still in the sky but the sun is not, and dusk is the moment the sun is 18 degrees below the horizon. So, I know that these are unique terms with specific, mutually exclusive definitions. But let me tell you something, I believe that if I randomly substituted one term for another based purely on my personal whimsy, people are gonna get what I mean regardless.


  • I suppose I cling to the old adage that a bad game is bad forever, while a delayed game may some day be good. It’s less true today than when Miyamoto said it (No Man’s Sky being the commonly cited example of a game which was able to turn its radioactive launch into a fairly positive experience), but I still believe it’s more accurate than not. I’m picking on a straw man here, but I wonder how many of those “gamers” bemoaning Halo’s long absence also look down their noses at the yearly release mill of sports games. Far as I’m concerned, new games in a franchise should come when the creators feel they have something new to showcase. A new mechanic, new engine, a new plot, whatever. Obviously, the games industry at large is perfectly happy to ok boomer me, and I’m perfectly happy to keep mining through my backlog of games which manage to be fun without live updates.



  • Would the sensation be similar to being at high altitude without oxygen? There is a Smarter Every Day video from several years ago where the host conducts simple cognitive and motor function tests in a pressure chamber which simulates high altitude atmospheric conditions. Within a couple of minutes of being off oxygen, he’s suffering from hypoxia and is unable to either continue the tests, or to mask up, despite being told that he will die if he doesn’t secure oxygen. Admittedly, it’s incredibly chilling to see the guy rendered so helpless, but, from his perspective, it did not seem particularly traumatic. As I understand it, if he had not had his mask applied for him at that point, he would have lost consciousness and then died in his sleep shortly thereafter. All things considered, not the WORST way to go. Beats getting stuck in that compartment with a leak and eventually drowning.




  • Right? Like I see folks in this thread and elsewhere echoing some of the typical things you hear when Hollywood botches an adaptation. Things like “it would be better if it was faithful to the source material” and other sentiments like that.

    However, in this case, the one aspect of the games that is easily translatable to film (the writing) seems to have aged the absolute worst. Self-referential Internet humor was a bold, unique aesthetic in 2009, but it’s been largely played out in the 15 years since the OG game released, or at least Borderlands’ take on that style of humor has gotten stale. Maybe the writing was better outside of 2 and Tiny Tina’s (the entries I played the most), but I sort of doubt it.

    I would not want to be tasked with adapting Borderlands. Stick close to the source material, get flamed for writing something juvenile. Diverge from the source material, get accused of not capturing the spirit of the franchise. It’s an impossible situation.


  • Yes, I believe the figure they cited was that Google earns 73% of their revenue through ads. I imagine what they would have to do is bust up the ad services in addition to the various departments of Google. Each new entity formed gets to keep revenue from ads shown on their platform maybe? E.g. YouTube gets spun off into its own thing separate from Google proper. They get to keep ad revenue from what is shown on their platform, but they don’t get to touch any revenue from sponsored search listings, or from banner ads on other websites, etc.

    That’s an approach that makes surface level sense to me, but I am neither a lawyer nor a business bro nor a tech bro. So, I don’t actually have the faintest idea if my idea bears any resemblance to reality.




  • I disagree with your assessment. To an average user, whatever winds up saved in their browser cache is there mostly unintentionally. Yes, it’s saving info from sites they choose to visit, but after that initial choice, the user is out of the loop. The browser saves what it needs to without user notification or input. I might even wager that most users are unaware of their browser cache, or don’t know what’s in it or how to access it. Therefore, I believe your metaphor perhaps confers too active a decision-making process on something that most people are completely unconscious of.

    To be clear, the strawman average user I’m using here is me. I know I have a browser cache, I know vaguely what is stored in it and why, and I know how to clear it if I’m having certain issues. That’s about it. I sure as heck don’t treat it as an archive.


  • Hmm, so, last month I began to have issues with my Chromecast for the first time. I have an old 3rd gen Chromecast attached to my bedroom television (not a smart TV) for the purpose of casting obnoxiously long video essays to fall asleep to. After like a decade of essentially hassle-free operation, it suddenly stopped being able to maintain a connection to my phone. I cast a video, and after approximately 10 minutes, the cast disconnects and I get a message on my phone saying “this video cannot be played in the background”. I’ve tried every troubleshooting technique I can think of.

    I know I shouldn’t attribute to malice what can be explained by other causes, but boy, seeing this news today sure makes me think about things like planned obsolescence.