Open Boundary Conditions: A grid for intensive study
Kym Ward
| | Watery Columns: CTD | Spongy Model Edges: FVCOM | Squints & True Colours: MATLAB |
| Expanded old-school: FSTS; Patronage / gender; Social constructivist; Who & where | Challenger & colonialism | Accuracy & patronage; ‘good enough’ measurements | ‘Color carries the responsibility of honesty’; moral relativism |
| Measurements that matter: new materialisms; agential cut | Sammler; Datum | Isometric net; Cuts that divide problematics in data science (atmosphere model and scales of comparison) | semiotics of color; rainbow deception |
| Gestationality: speculative; life/non/life (problematizing distinction); phenomenological; Relates to Scientific Prediction | Wax; non-life collection; Neimanis | Biological model and life integration; Cosmos as a technological system | Intuition for meaning of color map is natureculture; data-vis as warnings not celebrations; exhaustion |
“I think that perhaps there is importance in starting various forms of intensive learning and intensive study”, Kym Ward explains when we ask about the grid that she devised to research Open Boundary Conditions. Kym works at Bidston Observatory Artistic Research Centre in Liverpool, a place that has historically been occupied by different research-led organizations — up to now, predominantly in the Natural Earth Sciences.[1] Originally built to measure time, latitude and the declination of the stars, the observatory later housed employees working with meteorological, tidal and other marine data. Following this lineage from astronomical observation to maritime scoping and charting, she became interested in the techno-political history of tidal prediction and started to study together with researchers from the National Oceanography Centre (NOC). In the following transcript, Kym explains to us what is at stake in this work, and how it is structured.
An area of interest that needs focus
In the models that are used to run massive data sets, to do predictions for earth sciences or for meteorology or oceanography, there is an area of interest that needs to be focused on, because you can’t collect and process all data. For example, if you’re trying to figure out what waves will occur in a seascape, you need some edges to the object that you’re looking at.
The issue with creating edges is that they just stop, that they make something finite, and things are often not finite. Waves have no edges and they don’t just end. So, if you’re trying to figure out different conditions for one area, a year in advance, you are going to have to figure out what comes in and what goes out of this imaginary realm. This is why you need what are called “open boundary conditions”: the mathematics that are applied to hundreds of sets of variables that create the outside of that model in order for it to run.
There are a lot of different ways to create outside boundary conditions, and there are various kinds of equations that, in all honesty, are above my head. There are differential equations depending on what your object is, and if you’re looking at waves, then you will use elliptic and hyperbolic equations.
The issue comes when you need to run two different kinds of data sets. You need to understand what wind is going to do to waves, for example. And if you need to know that, you are going to involve both the ocean model and the atmosphere model, which are on some level incompatible. The atmosphere model has many more data points than the ocean model, at something like a ratio of 1000 to 1. That means it is so much more fine-grained than the ocean model that they cannot simply be run together: for every step of the ocean model, there are a thousand steps for the atmosphere model to run through. The open boundary conditions need to provide the sets of conditions that will allow these models to be integrated at massively different scales. That is one example.
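As a very rough illustration of that scale problem, here is a minimal sketch of sub-stepping a finer model inside each step of a coarser one, with boundary values exchanged at the coupling point. Everything in it is an illustrative assumption (the function names, the relaxation formulas, the nominal 1000:1 ratio taken from the paragraph above), not the actual coupling mathematics used in these models.

```python
# A toy coupling loop: the fine "atmosphere" model takes many small steps
# inside one coarse "ocean" step, and each model only sees the other
# through a boundary value. All names and formulas are invented.
SUBSTEPS_PER_OCEAN_STEP = 1000  # nominal ratio mentioned above

def step_ocean(state, boundary, dt):
    """Placeholder ocean step: relax slowly toward the atmospheric boundary value."""
    return state + 0.0001 * dt * (boundary - state)

def step_atmosphere(state, boundary, dt):
    """Placeholder atmosphere step: relax toward the oceanic boundary value."""
    return state + 0.001 * dt * (boundary - state)

def coupled_step(ocean, atmos, ocean_dt):
    atmos_dt = ocean_dt / SUBSTEPS_PER_OCEAN_STEP
    for _ in range(SUBSTEPS_PER_OCEAN_STEP):    # a thousand fine steps...
        atmos = step_atmosphere(atmos, ocean, atmos_dt)
    ocean = step_ocean(ocean, atmos, ocean_dt)  # ...per single coarse step
    return ocean, atmos

ocean, atmos = 10.0, 15.0
for _ in range(3):
    ocean, atmos = coupled_step(ocean, atmos, ocean_dt=3600.0)
    print(round(ocean, 3), round(atmos, 3))
```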
This term, “open boundary conditions”, makes sense to me, because of the gathering and gleaning that I have been doing across different disciplines, knowing that the vocabularies and discipline-specific words I am using will be warped, and perhaps not have the same equations applied to them. But coming from critical media theory, or philosophy of technology, and then moving to applied sciences is going to produce some interesting differences in timescales and steps. The reason I’m talking about this at all is that I landed at Bidston Observatory Artistic Research Centre, and this was formerly a place for astronomical observation. From astronomical observation it moved to tidal research and then prediction and charting. The history of the observatory, as part of the artistic research centre that it is now, leads you to the kinds of data visualizations that are produced by modeling and data collection, and to the discipline of oceanography as a whole.
Modelling Waves and Swerves
Modelling Waves and Swerves started off as a dusty scrabble around the basements.[2] I was excited to find original IBM 1130 data punch cards, which had been used in tidal prediction. But this soon turned into scratching my head over the harmonic calculations of tidal prediction machines, and I needed more help to understand these. And so, with collaborators, we set up Modelling Waves and Swerves — an ongoing series of weekend work sessions. In our initial call-out, we beckoned to “marine data modellers, tired oceanographers, software critics and people concerned with the politics of predictive visualizations”. The tiredness was not a typo — it was intended as a mode of approach, of care, for the limits of a discipline; and to navigate between the steps of data collection, prediction and dispersal of climate change data. Repetitive conclusions of ocean warming and sea level rising are regularly released, and when these meet the reception from wider publics, which can sometimes at best be described as indifferent, surely scientists must be a little weary?
So these work sessions take place in the observatory, which was formerly occupied by the National Oceanography Centre (NOC), and sits just outside of Liverpool, in the UK. The group looks at current and historical processes of data collection, assimilation and computational modelling of oceanographic data sets, on the way to their visual outputs — and these chronologically range from ink blotted wavy lines on a drum of paper, to hyper-real 3D renderings.
The types of data visualizations we find now are 3D ocean-current models, or colour-variated global warming indices. If we are asking about the looseness of attachment between data visualization and energetic response, and why there is so little real response to those snippish heat stripes, then in an appeal to ethics and behavioural change, it might be useful to reexamine some methodologies of data science for their onto-epistemological grounds. This is the focus of “open boundary conditions”.
One of the initial questions that the oceanographers asked us in these workshops was why the visualizations they have been doing aren’t being received in a way which creates real change: why there is a deadening of effects when they produce their outputs, even though these come in beautiful color stripes, in swirling movements across the globe that quite clearly show the warming, and in cross-section maps where you can see sea level rise. These are obviously worrying, and if we take them seriously, they pose an existential threat.
I think there are a lot of artists and designers who would happily produce “better” visualizations, but you have to wonder: what are the parameters of “better” in this case? More affective? Seemingly more “real”? In fact, what we’re interested in is the steps taken to get to the visualizations in the first place. So, the collection of data, the running of models, and then the output.
A grid but not a monument
The first thing to note is the impossibility of conducting this kind of research alone: if it were important, it would be important to more people than me. So I’m not very precious about the grid that I have proposed. It’s not a monument. I think that perhaps there is importance in starting various forms of intensive learning, intensive study, which I see there is also a desire for.
I haven’t seen the desire for exploring and explaining the technological back-end but I do see the desire for trying to get to grips with understanding oceanality and the ocean in an ecological sense. So I can see that there would be amazing possibilities for working with other people, in which you would hope that it wouldn’t all be struggling with text. That it could find some visual form, that it could find some practical form, that it could find some performance form, working in combination with the histories of science as they are, but also recombining to make other forms of knowledge. I would never have done this without the oceanographers and the data scientists. There is no possibility that I could have understood harmonic constants without a little bit of input.
Yes, it comes from a concern that by working with a critique of technological processes of oceanography, towards data visualisation, I’m only deconstructing the different inheritances of Modernity. For example, looking at biopower through affect theory, looking at the way that color affects the regulation of the body and its response. Or looking at it through a criticism and awareness of colonial history, and how that has built the technologies in both extractivist and utilitarian ways. There’s a legitimacy in doing that, but it doesn’t create any kind of constructive conversation with anyone that I’ve been working with: with oceanographers, with data scientists. It does create productive conversations with philosophers, but those might not reach any conclusion.
My suspicion was that certain discourses that are happening in feminist science studies, in new materialisms and in feminist phenomenology could add to an understanding that in the end, a color stripe might not make that much difference, or create inaction. To do that, rather than to just open some books and read some pages, I thought that it would be more invested and involved, and careful and considerate and honest, and also confused, to take some objects and try to talk these through discourses and questions via those objects. So, I picked three.
Watery Columns: The CTD monitor
The first example I picked was a CTD Monitor. A CTD Monitor is a metal instrument which gets dropped down from an amazing buoy. There will be 10 or 12 CTDs which are arranged in a ring, and they get dropped, and sink to the bottom of the ocean. And then at some point, on a timer, they are released, and they will rise. And as they rise, their little metal mouths will open up and grab a gulp of sea water at a particular level. The mouths will close and they will proceed to the top, and at some point they will be collected, and this happens over a certain time period. It’s testing for salinity, it’s testing for temperature, it’s testing for depth. Salinity is measured by conductivity and hydrostatic pressure, I think.
This logic follows a long history of the way that the seascape which the CTD instruments rise through is carved up. Originally, it would have been a hemp rope, weighted with lead, which would be dropped from the side of a ship. As it drops, it runs through the hands of the sailors. There are knots on the rope, and each knot represents a fathom, and the fathoms are called out, and someone marks them with a quill pen.
Through the architecture of Modernity, oceanography has a way of imagining the sea as a column. The sea, a very unstriated space, is imagined as an unchanging space. Even today, this is how information is collected. This holds even for the more unusual forms of data collection, such as the mini CTDs that are glued onto the heads of seals (a lot of the arctic data is from different seals who swim around). There is a GPS attached, and the data is still logged even while the seal is swimming happily with that thing glued to its head. The sea is still divided up into a grid: at a certain depth, what is the salinity, temperature and conductivity, for example.
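To make the “column” picture concrete, here is a toy sketch, with invented numbers, of how scattered CTD readings end up as a grid of values at standard depths. The binning scheme is an illustrative assumption, not any particular institution’s processing chain.

```python
# A toy sketch of the "watery column": scattered CTD samples are binned onto
# fixed depth levels and averaged, so the sea becomes a grid of values at
# standard depths. The numbers are invented.
from collections import defaultdict

# (depth in metres, temperature in degrees C, salinity in PSU)
samples = [
    (4.8, 14.1, 34.9), (5.3, 14.0, 35.0),
    (48.7, 11.2, 35.1), (51.2, 11.0, 35.2),
    (98.9, 8.4, 35.3),
]

def to_column(samples, bin_size=10.0):
    """Average every sample falling into the same depth bin."""
    bins = defaultdict(list)
    for depth, temp, sal in samples:
        level = int(depth // bin_size) * bin_size   # floor to the bin's top depth
        bins[level].append((temp, sal))
    column = {}
    for level, values in sorted(bins.items()):
        temps, sals = zip(*values)
        column[level] = (sum(temps) / len(temps), sum(sals) / len(sals))
    return column

for level, (temp, sal) in to_column(samples).items():
    print(f"{level:>5.1f} m   T = {temp:.1f} C   S = {sal:.2f}")
```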
So even when sea mammals are put to work doing scientific investigation, this investigation is then recalibrated into what is fundamentally a giant technological system formed on axes. It really brings home the quite strict ontological ground for sea exploration, and the types of relationality that happen in a vast expanse of many different types of sea lives and many different kinds of waters. Undersea vents, tectonic plates, underwater volcanoes: ecologies which are then being programmed into fundamentally the same model. The data are being used not to explore something different, but to expand Western knowledges along an axis.
Spongy Model Edges: FVCOM
Another way that the seascape is absurdly chopped or divided from its messiness and never-ending movement is the construction of maritime boundaries, which are basically virtual objects in the sea, carved up by what is a nation state, by what is landmass. They are geopolitical artifacts. For example, since the late 1700s, at one of the points in the Americas, at Saint Martha’s Bay, the sea has been recorded all the way down that coast, over the period of a year, and the mean sea level is found. It’s a mean sea level, because tides go up and down; there are semi-diurnal tides, there are diurnal tides, there are mixed tides. There are waves! There are still sea movements that are foxing oceanographers. But in any event, the sea was averaged; there was a highest point, a lowest point, and the mean sea level was used to construct a zero, a datum. And from this point you start to measure mountains, upwards. How many kilometers something is above the sea. And how can you measure the sea? You measure it from the average of the sea. It’s absurd, but it’s also the globally agreed protocol.
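Here is a minimal sketch, with an invented tide record, of the averaging described above: a year of readings is reduced to a mean sea level that then serves as the zero from which other heights are measured. The numbers and the survey value are purely illustrative.

```python
# Construct a datum by averaging a year of (fake) hourly tide-gauge readings,
# then express another height relative to that zero. All numbers are invented.
import math

hours = range(24 * 365)
# Fake record: a semi-diurnal oscillation (about 12.42 h period) around 2 m.
readings = [2.0 + 1.5 * math.cos(2 * math.pi * h / 12.42) for h in hours]

highest = max(readings)
lowest = min(readings)
mean_sea_level = sum(readings) / len(readings)   # the datum, the "zero"

print(f"high water {highest:.2f} m, low water {lowest:.2f} m, "
      f"datum {mean_sea_level:.2f} m on the gauge")

summit_on_gauge = 980.0   # hypothetical surveyed height on the same gauge scale
print(f"summit: {summit_on_gauge - mean_sea_level:.2f} m above mean sea level")
```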
So what happens when you introduce climate change into this phenomenon is that mountains start shrinking, because sea levels are rising. It has sociological, geological, urban-planning and planning-application effects, which are in the end political. What is classified as a disappearing island, or a non-disappearing island, becomes ratified.
FVCom is one of many models that are used as a coordinate system. The example I gave earlier is just one example of data that is collected: salinity, temperature, depth; and obviously there are billions of data points that are also collected along rivers, along the coastline, and within the sea. One of the interesting things about how data is collected is that the nodes of data collection are very tightly packed around the coastlines, near rivers, and they are arranged on an isomorphic net: a triangular grid system that can be scaled. It can be expanded or contracted depending on how close you want to zoom into that particular part of ocean, or coastline. And as you move out to sea, the grid gets a lot bigger. So the point at which the data is collected is averaged so that the data can run. And way out in the middle of the ocean, you might have two kilometers or three miles between each of those corners of the triangles of this net, and anything between those nodes gets averaged. Whereas at the coastline, you’ll have much tighter data, and the net will be in centimeters, or meters, not in miles.
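As a small illustration of that scalable net, here is a sketch in which the spacing between mesh nodes grows with distance from the coast, so that offshore values stand in for ever larger averaged areas. The growth formula and the numbers are assumptions for illustration, not FVCOM’s actual mesh generation.

```python
# Illustrative only: node spacing (triangle edge length) coarsens with
# distance from the coast, so anything between nodes is represented at
# a coarser average. The formula and constants are invented.
def edge_length_m(distance_from_coast_km,
                  min_edge_m=50.0, max_edge_m=5000.0, growth_per_km=100.0):
    """Coarsen the mesh linearly with distance offshore, within bounds."""
    return min(min_edge_m + growth_per_km * distance_from_coast_km, max_edge_m)

for d in (0, 1, 5, 20, 100):
    print(f"{d:>4} km offshore -> triangle edges of ~{edge_length_m(d):,.0f} m")
```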
So FVCom is one of the many models, called “the ocean model”, that we’ve been looking into. All of these models begin in the late ’60s and early ’70s; they’ve been developed along the way in the intervening years, and they take on more data points. What was initially not understood as being part of the ocean will then form one of the later models, for example the biological model, which is made of tiny life forms, phytoplankton and zooplankton — that came later. I already talked a little bit about how the models overlap and sync with each other.
Sponginess is a term used to describe the boundary conditions where one massive model meets another massive model. As for the data which was collected to put into the model, if I describe it historically, one of the ways in which the process of modeling happens is this: someone takes measurements along the coastline over the course of a year, and the data is sent in. The sheets of data that were sent in would be really grubby — they would perhaps be water sodden; but they were basic tabulations about the tide heights, the moon, the distances between waves. Different data like that. Before the advent of computers as we know them now, this information would be sent, in this way, to Bidston Observatory, so that’s my access point into this history. And then that data would be fundamentally programmed so that the height of the tides or the wavelength, or the effect of the moon, would be run through different differential equations, and then it would be assigned a value. The value would be put into a tidal prediction machine. This machine was made of metal, with 42 brass discs. A band ran in-between these discs, and each of the discs had a different name — for example, m2 was the moon. And these discs would move up and down on arms. What was produced at the end of this computation was placed onto a roll of paper on a spinning drum by an arm, with an ink pot attached at one end and a pen at the other, which would draw out the harmonics — a wave. This wave was a prediction for next year’s tides.
The tidal prediction machines around the time of the Second World War could do one year's worth of predictions in one day. Different places around the world would send in their tidal calculations and receive back the predictions for the year, saying at what time the tide would reach what height. The different harmonic constants, as they were called, that were run through the tidal prediction machines still find themselves in the predictions nowadays. They've been massively updated, and there are obviously many more data points, but you can still find them in how FVCom works.
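For a sense of what those harmonic constants do, here is a minimal sketch of harmonic synthesis: the predicted tide is a mean level plus a sum of cosine constituents, which is what the machine's discs and pen computed mechanically. The constituent periods below are approximate; the amplitudes, phases and mean level are invented and are not any real port's constants.

<syntaxhighlight lang="python">
# Minimal sketch of harmonic tidal synthesis: height = mean level plus a
# sum of cosine constituents, each with its own speed, amplitude and phase.
import math

MEAN_LEVEL = 5.0  # metres above the local datum (invented)

# (name, period in hours, amplitude in metres, phase in radians)
constituents = [
    ("M2", 12.42, 1.80, 0.3),   # principal lunar semi-diurnal
    ("S2", 12.00, 0.60, 1.1),   # principal solar semi-diurnal
    ("K1", 23.93, 0.25, 2.0),   # luni-solar diurnal
    ("O1", 25.82, 0.20, 0.7),   # lunar diurnal
]

def tide_height(t_hours):
    """Predicted height at time t: the machine's summed 'wave'."""
    h = MEAN_LEVEL
    for _, period, amplitude, phase in constituents:
        omega = 2 * math.pi / period
        h += amplitude * math.cos(omega * t_hours + phase)
    return h

# One day of predictions, the sort of curve the pen drew on the drum.
for t in range(0, 24, 3):
    print(f"hour {t:2d}: {tide_height(t):.2f} m")
</syntaxhighlight>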
One of the interesting things that happens in between data collection, human error, different calculations and output is that sometimes you get an output that does not resemble a harmonic; it doesn't resemble a wave form. It needs to be smoothed. At that time, in order to correct it, it was simply rubbed out and drawn over with a pencil. The computers in the 1930s (the women who operated the machines were called computers) had partners: the "smoother", whose task it was to correct the prediction blip. I see a connection between the isometric grid with its averages in the middle of the sea and the job of the "smoother". They are both attempts to speak to what counts as legitimate accuracy.
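A rough modern analogue of the smoother's task, purely illustrative (the threshold and data are invented, and real quality control is more involved): if a predicted value deviates wildly from its neighbours, pencil in their average instead.

<syntaxhighlight lang="python">
# Illustrative stand-in for the "smoother": replace a point that deviates
# wildly from its neighbours with the neighbours' average.

def smooth_blips(series, threshold=1.0):
    smoothed = list(series)
    for i in range(1, len(series) - 1):
        neighbour_mean = (series[i - 1] + series[i + 1]) / 2
        if abs(series[i] - neighbour_mean) > threshold:   # a "blip"
            smoothed[i] = neighbour_mean                  # pencil it back in
    return smoothed

predicted = [4.9, 5.3, 5.6, 9.8, 5.7, 5.4, 5.0]   # one value clearly off
print(smooth_blips(predicted))
</syntaxhighlight>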
One of the strands of research that I've been doing was helped a lot by the feminist science and technology scholar Anna Carlsson-Hyslop, who wrote about Doodson, one of the previous directors of the observatory.[3] He was doing a lot of work on tidal prediction. She traces a line from his conscientious objection in the First World War to his subsequent work on aircraft ballistics. He doesn't want to go to war, he won't fight; so instead, because he is good at math, he is conscripted to do mathematical scientific research, calculating the trajectories of bombs. As part of this work, he developed a way of looking at the arc of a missile using differential equations.
Carlsson-Hyslop writes about the interaction between patronage and what counts as an accurate calculation. In order for these calculations to be done, somebody has to pay for them. Doodson is receiving a wage, but he also knows that there are "good enough" calculations for a given set of conditions. When we think of the lineage of modeling, the impetus is to become more and more accurate. But it's super helpful to keep in mind that there is a difference between accuracy and legitimacy. The demand for accuracy supposedly makes something more legitimate; from a feminist science point of view, however, the two don't simply correlate.
I'm just trying to figure out why I thought that the depths were denser. Obviously they are, because there is more life there. But the amassed points of interest are not the same as organic life. The surface of the water is more recordable, visible, datafiable. The depths are unknown. I think I was trying to make a link to what superficial means… whether there's something productive in it in a literary sense. The superficial is able to be captured a lot more easily.
=== Squints & True Colours: CM Ocean ===
The third object of study is called CM Ocean. It's software running in MATLAB, used to output the data once it has been run through the model. It is a visualization that runs alongside and renders varying scales of data via color. There are a lot of different programs that can turn ocean data into color, like heat stripes: water warming, sea warming, water level rise, salinity… lots of different kinds of data.
We started off this journey by speaking about why visualizations don't produce an effect when they have to do with existential questions like climate change. So it makes sense to talk about CM Ocean.
The data that is transformed into these visualizations is numerical; it's quantity. It is then translated into a scale that is absolutely not numerical, and is very subjective in terms of its reception. The aim of CM Ocean is to desubjectify, to make color scientific. It is quite a task, and it's surprising that a group would take it on. But CM Ocean is funded by BP, a multinational oil and gas company, and by George Bush. It's not that this necessarily has a one-to-one effect. But it's obvious, and worth noting, that an oil company and the Texas government would like to have a regulated way of understanding the contents of the ocean.
The second thing is that what is being aimed at is the regulation of the subjectivity of color, which bypasses things like taste; it bypasses any kind of physiological reception. I was thinking that perhaps the expectation that color can be reproducible, that it can be accurate, that it can correctly represent numerical data, cannot be divorced from the numericizing of color in the first place: the attributions of CMYK and RGB. If color is printed, it is different from color on a screen. There are so many unworkables to this method, if you think about it. But the belief is that good color usage carries the responsibility of honesty. To use colors in an honest way is the responsibility of the scientist. But what is honesty in the color representation of data points? Its predecessor, the default "jet" colormap, is supposedly not so accurate, not so precise, because it moves through the color scale with arbitrary perceptual weights. So it has you thinking that there's a density of whatever it is you're looking for in the ocean, because this particular part of the color scale reads as more dense to you, to the reception of the eye. Dark purple rather than light yellow might misrepresent the density of the object in question, but you would never know that, because the perceived symbolism is skewed. The gradient of the color has to accelerate and decelerate, but it might not do that at the same rate as the numerical values on the back-end. It might look like it's getting warmer quickly, but depending on how the color scale is being applied, it could completely skew the reading of the numerical results you've run your model for.
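To make the contrast concrete, here is a small illustrative sketch that renders the same invented temperature field with the old MATLAB-style "jet" scale and with one of CM Ocean's perceptually more uniform maps. It assumes matplotlib and the Python port of the cmocean colormaps are installed; the choice of the "thermal" map and the data itself are mine, for illustration only.

<syntaxhighlight lang="python">
# Sketch of the point about color scales: the same synthetic temperature
# field rendered with "jet" and with a perceptually more uniform cmocean map.
import numpy as np
import matplotlib.pyplot as plt
import cmocean

# Invented sea-surface temperature field (degrees C).
x, y = np.meshgrid(np.linspace(0, 1, 200), np.linspace(0, 1, 200))
sst = 12 + 6 * np.exp(-((x - 0.6) ** 2 + (y - 0.4) ** 2) / 0.05)

fig, axes = plt.subplots(1, 2, figsize=(9, 4))
for ax, cmap, title in [
    (axes[0], "jet", "jet: uneven perceptual steps"),
    (axes[1], cmocean.cm.thermal, "cmocean.thermal: more uniform"),
]:
    im = ax.imshow(sst, cmap=cmap, origin="lower")
    ax.set_title(title)
    fig.colorbar(im, ax=ax)

plt.tight_layout()
plt.show()
</syntaxhighlight>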
It's also worth saying that these models are hugely energy-expensive, and take around forty days to run. In the step from programming to output, massive amounts of electricity are used, and the possibility for something to go wrong is quite large. If it does, you have to start again and recalculate. As I mentioned at the start of this conversation, if we look at these instances in the process from data collection to output solely in a critical mode, then we fail in a remarkable way: the ocean, its inhabitants, what is life and what sustains us on this planet, is still and always our object of study. We need to propose other methods of working together, of offering feedback, which differently separate our object, or work with separability itself. The grid-not-monument we're working with here is an attempt at this.
=== Frame: Expanded old-school ===
I want to try to think through these three cases in an expanded, old-school, social-constructivist feminist way, where you think about where the object is being produced, who produced it, what effect it has and on whom, and what linguistic and semiotic exchanges take place because this technology has been built in this way and used by these people on these bodies; by bodies I mean the ocean, the body of water.
It is about naming where and when something has been produced, in order to properly understand the limitations of its production; about making clear the ramifications of who, and not resorting to a default "I", or the displaced "I" of objectivity.
=== Frame: Measurements that matter ===
The second frame is to use some of the work that has been done over the last 10 to 20 years on new materialism, to think about how, because all of these objects measure in different ways, they produce matter in the way that they measure. So the CTD monitor measures only X; it makes an apparatus which combines and makes the world in a certain way, which is then just a tiny little data point that is put into FVCom. It's difficult to talk about FVCom through new materialism, because it is such an object, but it can be done in a kind of reflective mode.
We tried quite hard, in Modeling Waves and Swerves, to work this frame. It is possible, but it's much easier to look at one instrument than at a combination of instruments that forms a massive instrument.
There is also the impossibility of retreating from massive models that separate, for example, ocean life and atmosphere. You need one of those models in order to have input on the data, but because they have already been divided in a certain way, you have to run with the implications of that. It is a lot easier when you go all the way out, but not when you are looking at FVCom, looking at the back-end in order to understand, as an oceanographer or a data scientist, and thinking, "OK, what would the agential cut be?".
=== Frame: Gestationality ===
And the third strand I call "the feminist phenomenological", but it really comes from reading the work of Astrida Neimanis, who wrote Bodies of Water.[4] In the book she speaks to onto-logics, to the onto-logic of amniotics. She calls it an onto-logic, not an ontology, which would deal with what "is", but rather a who, what, when, where, how of commons with whatever it is we call our more-than-human interlocutors. So she speaks about the amniotic in permeable, open-boundary, membrane kinds of ways. She is not only speaking about life that forms in the way of what she calls amniotes, life which forms in an amniotic sac, but she is also using it as a metaphor, as a fictional philosophical tool which is useful.
The reason I had centered on this is the question: why would feminist phenomenology have anything to do with different modes of technical production of the ocean? She speaks to water, to the different bodies of water along an evolutionary process, but she also speaks to them as a mode of reception, of understanding, of oneness with what is happening in the ocean. So it's a mode of understanding climate change, of potentially understanding sea warming. It has a lived bodily reality that we can connect to.
The second reason I thought it would be worthwhile to walk down this path a little is that if you're thinking about the onto-logics of amniotics, you're also thinking about gestationality, and gestationality also makes sense when you're talking about predictions, ocean predictions. Because, in the end, this movement between data collection, running the models and producing the visualizations defines what is seen to be the ocean and what is not: the contents of the ocean, the conditions of the ocean, the life of the ocean, what is not life in the ocean. And the kinds of predictions that are accredited and valued by science are highly technologized predictions.
What gestationality does is posit that life could come, that the possibility for life is there, but we don't know what kind of life will come or what it will look like. We don't have a clue; it's on the move, it's emergent, but there is no form to it yet. Compared to prediction and its vast technologies that I have tried to describe, I find gestationality useful and very exciting.
=== Notes ===
- ↑ “Bidston Observatory Artistic Research Centre (BOARC),” accessed October 20, 2021, http://www.bidstonobservatory.org.
- ↑ Open call for “Modeling Swerves and Waves,” accessed October 20, 2021, http://www.bidstonobservatory.org/?modelling_waves_swerves.
- ↑ Anna Carlsson-Hyslop, An Anatomy of Storm Surge Science at Liverpool Tidal Institute, 1919–1959: Forecasting, Practices of Calculation and Patronage (PhD thesis, University of Manchester, Faculty of Life Sciences, 2010).
- ↑ Astrida Neimanis, Bodies of Water: Posthuman Feminist Phenomenology (Edinburgh: Edinburgh University Press, 2017).