What’s the ROI? If 15% of wild-caught fish are used to support fish farms that produce twice as much, it’s not so obviously a bad thing. There’d need to be another food source, though.
Shoutout to my Fort Myers and Cape Coral homies
No but you have 8 boobs like a cat, enjoy
It’s required, but nontrivially so. It has been proven (Solovay’s model, which assumes the consistency of an inaccessible cardinal) that ZF + dependent choice is consistent with the assumption that all sets of reals are Lebesgue measurable.
“Measure” is meant in the specific sense of measure theory. The prototypical example is the Lebesgue measure, which generalizes the intuitive definition of length, area, volume, etc. to N-dimensional space.
As a pseudo-definition, we may assume:
1) The measure of a rectangle is its length times its width.
2) The measure of the disjoint union of two sets is the sum of their measures.
In 2), we can relax the assumption that the two sets are disjoint slightly, as long as the overlap is small (e.g. two rectangles overlapping on an edge). This suggests a definition for the measure of any set: cover it with rectangles and sum their areas. For most sets, the cover will not be exact, i.e. some rectangles will lie partially outside the set, but these inexact covers can always be refined by subdividing the overhanging rectangles. The (Lebesgue) measure of a set is then defined as the greatest lower bound of all possible such approximations by rectangles.
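To make the covering idea concrete, here is a small sketch of my own (not from the thread): approximate the measure of the unit disk by covering it with grid squares. Each cover gives an over-estimate, and refining the grid drives the total area down toward the true measure, π.

```python
import math

def disk_cover_area(n: int) -> float:
    """Total area of the n x n grid squares over [-1, 1]^2 that touch
    the closed unit disk. Every such cover over-estimates the disk's
    measure, and refining the grid shrinks the estimate toward pi."""
    cell = 2.0 / n
    count = 0
    for i in range(n):
        for j in range(n):
            x0, x1 = -1 + i * cell, -1 + (i + 1) * cell
            y0, y1 = -1 + j * cell, -1 + (j + 1) * cell
            # Nearest point of the cell to the origin (clamp 0 into the cell);
            # the cell touches the disk iff that point lies within radius 1.
            dx = min(max(x0, 0.0), x1)
            dy = min(max(y0, 0.0), y1)
            if dx * dx + dy * dy <= 1.0:
                count += 1
    return count * cell * cell

for n in (10, 100, 1000):
    print(n, disk_cover_area(n))  # decreases toward pi ≈ 3.14159...
```

Taking the greatest lower bound over all such covers (not just axis-aligned grids) is exactly the Lebesgue outer measure described above.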
There are 2 edge cases that quickly arise with this definition. One is the case of zero measure: naturally, a finite set of points has measure zero, since you can cover each point with a rectangle of arbitrarily small area, hence the greatest lower bound is 0. One can cover the nth point of any countably infinite set with a rectangle of area ε/n², so the total (ε·π²/6) can be made arbitrarily small, too. Even less intuitively, an uncountably infinite set of points can have measure 0 too, e.g. the Cantor set (which, despite having the cardinality of the continuum, is nowhere dense).
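The Cantor set case is easy to check numerically (a sketch of mine, not from the thread): each construction step removes the middle third of every remaining interval, so the total remaining length shrinks by a factor of 2/3 per step and tends to 0, even though uncountably many points survive.

```python
# Upper bound on the Cantor set's measure after k construction steps:
# each step keeps 2/3 of the remaining length, so the bound is (2/3)^k,
# which tends to 0 as k grows.
def cantor_measure_upper_bound(steps: int) -> float:
    return (2 / 3) ** steps

for k in (1, 10, 50):
    print(k, cantor_measure_upper_bound(k))
```

Since the Cantor set sits inside the remaining intervals at every step, its measure is below every one of these bounds, hence exactly 0.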
The other edge case is the unmeasurable set. Above, I described a covering process and defined the measure as the greatest lower bound over all such covers. I took for granted that this assignment is well behaved. Indeed, it is hard to imagine otherwise, and that is precisely because under reasonably intuitive axioms (ZF + dependent choice) it is consistent to assume every set is measurable. If you take the full axiom of choice, you may “construct” a counterexample, e.g. the Vitali set. The essential role of the axiom of choice in defining this set ensures that it is difficult to gain any geometric intuition about it. Suffice it to say that the set is both too “substantial” to have measure 0, yet too “fragmented” to have any positive measure, and is therefore not well behaved enough to have a measure at all.
By performing measure-preserving transformations on non-measurable sets and acting surprised when, at the end of the day, measure isn’t preserved. I don’t blame AC for that. AC only implies the existence of a non-measurable set, which is in itself not totally counter-intuitive.
The axiom of choice doesn’t say one way or another whether the spectrum in “the standard order” (is there a standard definition of more/less gay?) is a well ordering, only that there is some well ordering.
We’re back to “crud” and “shucks” now boomer
Get off your damn horse man, this thread is for poking fun at twats who have to correct everybody
No, it’s not.
Maybe you’ll learn, or go someplace where technical accuracy matters. Because here it doesn’t.
I get it, you have no self respect
Being proud of ignorance is a really cool trait
Ok, go ahead and continue posting misinformation and getting mad about being corrected instead of just learning
I am an audiophile, not an idiot. They don’t. A slim theoretical possibility that reproducing content above 20 kHz audibly alters the signal within the audible range may exist, but you will never, ever demonstrate the ability to detect a difference in a double-blind test.
The only reason to use a sample rate higher than 44.1 kHz is to avoid resampling audio that is already at a different rate, e.g. video/DVD audio, which is usually 48 kHz, or potentially “hi-fi” sources that may be 96 kHz or higher. (CDs themselves are 44.1 kHz.) Resampling can theoretically introduce audible artifacts, although a modern CPU running a modern resampling algorithm can very easily perform transparent resampling in real time.
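To show what the rate conversion itself involves, here is a deliberately naive sketch of mine. Real resamplers use band-limited (polyphase/sinc) filtering to suppress aliasing and imaging; this only illustrates the index arithmetic of mapping one sample grid onto another via linear interpolation.

```python
import math

def resample_linear(samples, rate_in, rate_out):
    """Naive rate conversion by linear interpolation.

    NOT production quality: a real resampler band-limits the signal
    (polyphase/sinc filtering). This sketch only shows how output sample
    positions map to fractional positions on the input grid.
    """
    n_out = int(len(samples) * rate_out / rate_in)
    out = []
    for i in range(n_out):
        pos = i * rate_in / rate_out          # fractional position in input
        j = int(pos)
        frac = pos - j
        a = samples[j]
        b = samples[min(j + 1, len(samples) - 1)]
        out.append(a + (b - a) * frac)        # linear interpolation
    return out

# One second of a 1 kHz sine at 48 kHz, converted to 44.1 kHz.
tone = [math.sin(2 * math.pi * 1000 * n / 48000) for n in range(48000)]
converted = resample_linear(tone, 48000, 44100)
print(len(converted))  # 44100 samples out for 48000 in
```

The quality gap between this and a proper band-limited resampler is exactly the kind of artifact a modern algorithm makes inaudible.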
My comment was supposed to be in reply to some lunatic spouting word salad, not a top level comment. But thanks for your effort anyway.
I consider myself an audiophile, but that doesn’t require being uninformed, susceptible to snake oil, or judgmental. I collect FLACs and understand that there’s no audible difference between a lossless copy and a good 320 kbps CBR / V0 MP3 transcode, etc.
Bit depth is not the same as bitrate, and there is no difference in the signals that can be reproduced within the range of human hearing between a sample rate of 44.1 kHz and one of 96 kHz.
The difference is literally, mathematically 0 unless you think your hearing extends past 22.05 kHz (the Nyquist frequency at 44.1 kHz), never mind the typical ~18 kHz or the widely cited maximum of 20 kHz.
What do you think the problem is exactly? Low sample rate? Are you familiar with the Nyquist sampling theorem?
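The theorem’s consequence is easy to demonstrate numerically (my own sketch): a tone above the Nyquist frequency (fs/2) produces exactly the same samples as its alias below it, so content above 22.05 kHz can’t even be represented at 44.1 kHz — there is nothing there for a higher rate to “improve” within the audible band.

```python
import math

fs = 44100             # CD sample rate; Nyquist frequency is fs / 2 = 22050 Hz
f_high = 23000         # a tone above Nyquist
f_alias = fs - f_high  # 21100 Hz, its alias below Nyquist

# Sample both cosines at fs: the sequences are identical, sample for sample,
# because cos(2*pi*(fs - f)*n/fs) = cos(2*pi*n - 2*pi*f*n/fs) = cos(2*pi*f*n/fs).
n_samples = 1000
high = [math.cos(2 * math.pi * f_high * n / fs) for n in range(n_samples)]
alias = [math.cos(2 * math.pi * f_alias * n / fs) for n in range(n_samples)]

max_diff = max(abs(a - b) for a, b in zip(high, alias))
print(max_diff)  # effectively zero (floating-point noise only)
```

This is why ADCs low-pass filter before sampling: anything above fs/2 would fold back down as an alias rather than being captured.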
DACs have been very good and very cheap for years now. A $10 Apple USB dongle contains an extremely good DAC. At the consumer level, you’re paying for pretty much everything except sound quality now.
You do need an amp for some headphones. An amp can also deliver low power with a low noise floor for high-sensitivity earbuds, though that isn’t always necessary.
For sure, which is why I said “another food source would be needed.” I had in mind something like the wild-caught fish being processed into something useful as part of a more efficient food chain, e.g. combined with efficiently-farmed plant material.
I don’t have any context on the other pros and cons of fish farming, so I’m definitely not arguing whether it’s a net positive or not.