• 0 Posts
  • 144 Comments
Joined 1 year ago
Cake day: June 12th, 2023

  • > It’s almost certainly going to be less than 1. Moving further up the food chain results in energy losses. Those fish are going to use energy for their own bodies and such.

    For sure, which is why I said “another food source would be needed.” I had in mind something like the wild-caught fish being processed into something useful as part of a more efficient food chain, e.g. combined with efficiently-farmed plant material.

    > Moreover, there are high mortality rates inside fish farms for the fish themselves.

    I don’t have any context on the other pros and cons of fish farming, so I’m definitely not arguing whether they’re a net positive.






  • “Measure” is meant in the specific sense of measure theory. The prototypical example is the Lebesgue measure, which generalizes the intuitive definition of length, area, volume, etc. to N-dimensional space.

    As a pseudo-definition, we may assume:

    1. The measure of a rectangle is its length times its width.

    2. The measure of the disjoint union of two sets is the sum of their measures.

    In 2), we can relax the assumption that the two sets are disjoint slightly, as long as the overlap is small (e.g. two rectangles overlapping on an edge). This suggests a definition for the measure of any set: cover it with rectangles and sum their areas. For most sets, the cover will not be exact, i.e. some rectangles will lie partially outside the set, but these inexact covers can always be refined by subdividing the overhanging rectangles. The (Lebesgue) measure of a set is then defined as the greatest lower bound of all possible such approximations by rectangles.
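
    For the curious, the covering idea above is exactly the standard Lebesgue outer measure; a minimal formal statement (notation mine, not spelled out in the comment itself):

    ```latex
    % Lebesgue outer measure of a set E, where |R_n| is the area
    % (length times width, property 1) of the rectangle R_n:
    \lambda^*(E) = \inf\left\{ \sum_{n=1}^{\infty} |R_n| \;:\; E \subseteq \bigcup_{n=1}^{\infty} R_n \right\}
    % "inf" is the greatest lower bound over all countable covers of E by rectangles.
    ```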

    There are two edge cases that quickly arise with this definition. One is the case of zero measure: naturally, a finite set of points has measure zero, since you can cover each point with a rectangle of arbitrarily small area, hence the greatest lower bound is 0. One can cover the n-th point of any countably infinite set with a rectangle of area epsilon/n^2, so that the sum can be made arbitrarily small, too (see the worked series below). Even less intuitively, an uncountably infinite set of points can have measure 0 as well, e.g. the Cantor set (which is even perfect, i.e. dense in itself).
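
    Here is the arithmetic behind that epsilon/n^2 covering, written out:

    ```latex
    % Cover the n-th point with a rectangle of area epsilon/n^2.
    % The total covered area is a convergent series:
    \sum_{n=1}^{\infty} \frac{\varepsilon}{n^2} = \varepsilon \cdot \frac{\pi^2}{6}
    % Since epsilon > 0 is arbitrary, the greatest lower bound is 0,
    % so every countable set has measure zero.
    ```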

    The other edge case is the non-measurable set. Above, I described a covering-and-refinement process and defined the measure as its greatest lower bound. I took for granted that the resulting value still behaves like a measure, in particular that property 2) keeps holding. Indeed, it is hard to imagine otherwise, and that is precisely because under reasonably intuitive axioms (ZF + dependent choice) it is consistent to assume that every set is measurable. If you take the full axiom of choice, you may “construct” a counterexample, e.g. the Vitali set. The necessity of the axiom of choice in defining this set ensures that it is difficult to gain any geometric intuition about it. Suffice it to say that the set is both too “substantial” to have measure 0, yet too “fragmented” to have any positive measure, and is therefore not well behaved enough to have a measure at all.
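
    For reference, a compressed version of the standard Vitali argument (condensed here; the comment above only names the set):

    ```latex
    % Define x ~ y iff x - y \in \mathbb{Q}, and use the axiom of choice to
    % pick one representative of each equivalence class in [0,1], forming V.
    % The rational translates of V are pairwise disjoint and satisfy
    [0,1] \subseteq \bigcup_{q \in \mathbb{Q} \cap [-1,1]} (V + q) \subseteq [-1,2]
    % If V had measure m, translation invariance plus countable additivity
    % would force 1 \le \sum_q m \le 3, which fails both for m = 0 (the sum
    % is 0) and for m > 0 (the sum is infinite). So V has no measure.
    ```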


  • By applying measure-preserving transformations to non-measurable sets and acting surprised when, at the end of the day, measure isn’t preserved. I don’t blame AC for that. AC only implies the existence of a non-measurable set, which is in itself not totally counter-intuitive.





  • Kogasa@programming.dev to 196@lemmy.blahaj.zone · rule · 3 months ago

    > Get off your damn horse man, this thread is for poking fun at twats who have to correct everybody

    No, it’s not.

    > Maybe you’ll learn, or go someplace where technical accuracy matters. Because here it doesn’t.

    I get it, you have no self-respect.




  • Kogasa@programming.dev to 196@lemmy.blahaj.zone · rule · edited · 3 months ago

    I am an audiophile, not an idiot. They don’t. The slim possibility that reproducing signals past 20kHz causes audible changes to the signal within the audible range may technically exist, but you will never, ever demonstrate the ability to detect a difference in a double-blind test.
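
    For context, the arithmetic this claim leans on is the Nyquist–Shannon sampling theorem:

    ```latex
    % A signal band-limited to f_max is exactly reconstructible when
    % sampled at any rate f_s > 2 f_max (Nyquist-Shannon).
    f_s = 44.1\,\mathrm{kHz} \implies f_s / 2 = 22.05\,\mathrm{kHz} > 20\,\mathrm{kHz}
    % i.e. 44.1 kHz already covers the conventional ~20 kHz limit of human hearing.
    ```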

    The only reason to use a sample rate higher than 44.1kHz is to avoid resampling audio that is already at a different sample rate, e.g. video/DVD audio, which is usually 48kHz, or potentially “hi-fi” sources that may be 96kHz or higher. Resampling can theoretically introduce audible artifacts, although a modern CPU running a modern resampling algorithm can very easily perform transparent resampling in real time.
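
    A minimal sketch of such resampling, assuming SciPy is available (the 147/160 ratio is exact, since 44100/48000 reduces to 147/160; the function name here is illustrative, not from the comment above):

    ```python
    # Sketch: resample 48 kHz audio to 44.1 kHz with a polyphase filter.
    import numpy as np
    from scipy.signal import resample_poly

    def resample_48k_to_44k1(samples: np.ndarray) -> np.ndarray:
        """Resample a 1-D float array from 48 kHz to 44.1 kHz."""
        # 44100 / 48000 = 147 / 160 exactly, so no rational approximation is needed.
        return resample_poly(samples, up=147, down=160)

    # Example: a one-second 1 kHz test tone.
    t = np.arange(48_000) / 48_000
    tone = np.sin(2 * np.pi * 1000 * t)
    out = resample_48k_to_44k1(tone)
    print(len(tone), "->", len(out))  # 48000 -> 44100
    ```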


  • Kogasa@programming.dev to 196@lemmy.blahaj.zone · rule · 3 months ago

    My comment was supposed to be a reply to some lunatic spouting word salad, not a top-level comment. But thanks for your effort anyway.

    I consider myself an audiophile, but being one doesn’t require you to be uninformed, susceptible to snake oil, or judgmental. I collect FLACs and understand that there’s no audible difference between a lossless copy and a good 320kbps CBR / V0 MP3 transcode, etc.






  • Kogasa@programming.dev to 196@lemmy.blahaj.zone · rule · 3 months ago

    DACs have been very good and very cheap for years now. A $10 Apple USB dongle contains an extremely good DAC. At the consumer level, you’re paying for pretty much everything except sound quality now.

    You do need an amp for some headphones, particularly high-impedance or low-sensitivity ones. An amp can also deliver low power at a low noise floor for high-sensitivity earbuds, but this isn’t always necessary.