• 1 Post
  • 127 Comments
Joined 1 year ago
Cake day: June 9th, 2023





  • When I find myself becoming irked by someone offering help I don’t need, it helps me to think of things in terms of people who slip through the gaps: the system that the social worker is a part of strives to help those who need it, and you not needing that help makes you a false positive. You were likely flagged because sometimes when someone is living in their vehicle, this is a symptom (and reinforcing factor) of their life being in disarray. That is to say that some people who superficially look a lot like you are in need of support, and not catching these people would be false negatives. Bonus complication is that many people who do need this help may also be resistant to support (for a variety of reasons).

    Given that no system is perfect, and the error rate will always be greater than zero, we can ask the hypothetical “is it better to have fewer false positives and more false negatives, or more false positives and fewer false negatives?”. Put a different way, when you’re bothered, that’s you slipping through the gaps in a system that has opted for more false positives with the goal of helping as many people who need it as possible.

    Unrelated to everything else I said, I’m glad you’ve been able to find a way of living that you’re happy in — it is a challenge when the life that is best suited for us is one that society considers “abnormal”, so I’m happy to hear about anyone who has broken into what works.







  • The data are stored, so it’s not a live-feed problem. It is an inordinate amount of data that’s stored, though. I don’t actually understand this well enough to explain it well, so I’m going to quote from a book [1]. Apologies for the wall of text.

    “Serial femtosecond crystallography [(SFX)] experiments produce mountains of data that require [Free Electron Laser (FEL)] facilities to provide many petabytes of storage space and large compute clusters for timely processing of user data. The route to reach the summit of the data mountain requires peak finding, indexing, integration, refinement, and phasing.” […]

    "The main reason for [steep increase in data volumes] is simple statistics. Systematic rotation of a single crystal allows all the Bragg peaks, required for structure determination, to be swept through and recorded. Serial collection is a rather inefficient way of measuring all these Bragg peak intensities because each snapshot is from a randomly oriented crystal, and there are no systematic relationships between successive crystal orientations. […]

    Consider a game of picking a card from a deck of all 52 cards until all the cards in the deck have been seen. The rotation method could be considered as analogous to picking a card from the top of the deck, looking at it and then throwing it away before picking the next, i.e., sampling without replacement. In this analogy, the faces of the cards represent crystal orientations or Bragg reflections. Only 52 turns are required to see all the cards in this case. Serial collection is akin to randomly picking a card and then putting the card back in the deck before choosing the next card, i.e., sampling with replacement (Fig. 7.1 bottom). How many cards are needed to be drawn before all 52 have been seen? Intuitively, we can see that there is no guarantee that all cards will ever be observed. However, statistically speaking, the expected number of turns to complete the task, c, is given by c = n × (1 + 1/2 + 1/3 + … + 1/n), where n is the total number of cards. For large n, c converges to n*log(n). That is, for n = 52, it can reasonably be expected that all 52 cards will be observed only after about 236 turns! The problem is further exacerbated because a fraction of the images obtained in an SFX experiment will be blank because the X-ray pulse did not hit a crystal. This fraction varies depending on the sample preparation and delivery methods (see Chaps. 3–5), but is often higher than 60%. The random orientation of crystals and the random picking of this orientation on every measurement represent the primary reasons why SFX data volumes are inherently larger than rotation series data.

    The second reason why SFX data volumes are so high is the high variability of many experimental parameters. [There is some randomness in the X-ray pulses themselves]. There may also be a wide variability in the crystals: their size, shape, crystalline order, and even their crystal structure. In effect, each frame in an SFX experiment is from a completely separate experiment to the others."
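
    (A side note from me, not the book: the card game above is the classic “coupon collector” problem, and the ~236 figure is easy to sanity-check with a quick simulation. C++ only because that’s the language that came up elsewhere in this thread; everything below is my own illustrative sketch, not anything from the chapter.)

    ```cpp
    // Coupon-collector sanity check: how many random draws (with replacement)
    // until all 52 distinct cards have been seen at least once?
    #include <cstdio>
    #include <random>
    #include <vector>

    int main() {
        const int n_cards = 52;
        const int n_trials = 100000;
        std::mt19937 rng(42);
        std::uniform_int_distribution<int> draw(0, n_cards - 1);

        long long total_draws = 0;
        for (int t = 0; t < n_trials; ++t) {
            std::vector<bool> seen(n_cards, false);
            int remaining = n_cards;
            long long draws = 0;
            while (remaining > 0) {
                int card = draw(rng);
                if (!seen[card]) {
                    seen[card] = true;
                    --remaining;
                }
                ++draws;
            }
            total_draws += draws;
        }
        // Expected value is n * (1 + 1/2 + ... + 1/n), roughly 236 for n = 52.
        std::printf("average draws: %.1f\n",
                    static_cast<double>(total_draws) / n_trials);
        return 0;
    }
    ```

    The average should land right around 236, matching n × (1 + 1/2 + … + 1/n) for n = 52.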

    From the section “The Realities of Experimental Data”: “The aim of hit finding in SFX is to determine whether the snapshot contains Bragg spots or not. All the later processing stages are based on Bragg spots, and so frames which do not contain any of them are useless, at least as far as crystallographic data processing is concerned. Conceptually, hit finding seems trivial. However, in practice it can be challenging.”

    “In an ideal case shown in Fig. 7.5a, the peaks are intense and there is no background noise. In this case, even a simple thresholding algorithm can locate the peaks. Unfortunately, real life is not so simple”
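
    To make that concrete, a naive version of the “simple thresholding” idea might look something like this. This is my own sketch, not CrystFEL’s actual hit finder; the frame layout, threshold, and minimum peak count are all made up for illustration:

    ```cpp
    // Naive "hit finding" sketch: a detector frame counts as a hit if enough
    // pixels rise above an intensity threshold. Illustrative only.
    #include <cstddef>
    #include <vector>

    bool is_hit(const std::vector<float>& frame,   // flattened detector image
                float intensity_threshold,         // counts above background
                std::size_t min_bright_pixels)     // how many bright pixels make a "hit"
    {
        std::size_t bright = 0;
        for (float pixel : frame) {
            if (pixel > intensity_threshold && ++bright >= min_bright_pixels) {
                return true;   // enough candidate Bragg spots: keep the frame
            }
        }
        return false;          // probably a blank shot: drop it before indexing
    }
    ```

    Real hit finders have to cope with background noise and the like, which is exactly the “real life is not so simple” part.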

    It’s very cool; I wish I knew more about this. A figure I found for the approximate data rate is 5 GB/s per instrument; I think that’s for the European XFEL.
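
    If that 5 GB/s figure is in the right ballpark, a sustained run works out to roughly 5 GB/s × 86,400 s ≈ 430 TB per day per instrument, which makes the chapter’s talk of petabytes of storage and large compute clusters pretty plausible.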

    Citation: [1]: Yoon, C.H., White, T.A. (2018). Climbing the Data Mountain: Processing of SFX Data. In: Boutet, S., Fromme, P., Hunter, M. (eds) X-ray Free Electron Lasers. Springer, Cham. https://doi.org/10.1007/978-3-030-00551-1_7


  • What have you found most useful from switching? I switched to Emacs a while ago and still feel like a beginner, largely because I got too greedy with all the goodies at the beginning and ended up with loads of features I hadn’t learned to use yet and a messy init.el. I’ve since restarted and am adding features only as I need them, to prevent that same complexity sprawl.



  • He doesn’t directly control anything with C++ — it’s just the data processing. The gist of X-ray crystallography is that we shoot X-rays at a crystallised protein, which scatters them into a diffraction pattern; we can then do some mathemagic on that pattern to work out the electron density of the crystallised protein and, from there, the protein’s structure.
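
    For anyone curious what the mathemagic involves, the usual textbook picture (general crystallography, not anything specific to his setup) is that the electron density is a Fourier synthesis over the measured reflections:

    ```latex
    % Textbook crystallographic Fourier synthesis: electron density \rho at
    % fractional coordinates (x, y, z) from structure factors F_{hkl}.
    \rho(x, y, z) = \frac{1}{V} \sum_{h,k,l} \lvert F_{hkl} \rvert \,
                    e^{\, i \alpha_{hkl}} \, e^{-2\pi i (h x + k y + l z)}
    ```

    The catch is that the detector only records intensities proportional to |F_hkl|², so the phases α_hkl are lost and have to be recovered separately (the “phase problem”), which is a big part of why the processing is nontrivial.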

    C++ helps with the mathemagic part of that, especially because by “high throughput”, I mean that the research facility has a particle accelerator that’s over 1 km long and cost multiple billions, because it can shoot super-bright X-ray pulses at a rate of up to 27,000 per second. It’s the kind of place that’s used by many research groups, and you have to apply for “beam time”. The sample is piped in front of the beam, and the result is thousands of diffraction patterns that need to be matched to particular crystals. That’s where the challenge comes in.

    I am probably explaining this badly because it’s pretty cutting edge stuff that’s adjacent to what I know, but I know some of the software used is called CrystFEL. My understanding is that learning C++ was necessary for extending or modifying existing software tools, and for troubleshooting anomalous results.





  • As we saw with the COVID pandemic, even in “1st world countries”, poorer people were disproportionately affected. Fewer humans won’t help when the majority of harm to the Earth is perpetrated by a small fraction who would be disproportionately represented in a world where the majority of people died.

    I sympathise with your sentiment, because it often does feel like humans are the problem, but the reality is that we’re not. Although it can feel weirdly comforting to think of humans as inherently and innately destructive, thinking this way is a pipeline to eco-fascism, which doesn’t offer productive ways forward.