A conversation with Robert Hill, Head of Space Solutions at Kx  

7 Aug 2018

Kx and the Space Data Revolution

 Robert Hill is the Director of the Northern Ireland Space Office and chair of the Northern Ireland Space Special Interest Group (NISSIG) hosted by the UK Aerospace, Defence, Security and Space trade membership group ADS. He is also a Regional Fellow with the UK Satellite Applications Catapult.

Q:  What are the big changes you have seen in the space industry since you first started?

A: If you look back 50 or 60 years,  initial space endeavours like Sputnik and the Apollo program were solely within the remit of big governments, and only the largest of commercial corporations like Boeing and Lockheed were players in the supply chain due to the enormous research and development costs.

Today that has all changed. In the last 10 to 15 years we have seen a massive democratization of access to space, as governments realized they needed to tap the commercial drive and innovation of private industry through funding, grants and other initiatives. For the most part, this has been successful – so we see the likes of SpaceX and Blue Origin in the upstream sector for launching payloads, and, indeed in the near future, people into space, as well as companies like 3DEO and Planet in the downstream sector to process and analyze space data to gain insights from it. This is what we are calling New Space, or Space 4.0, and it’s a really exciting time for Kx to be involved.

Q:  You mentioned data – how has its profile changed?

A: Enormously. Despite the size of the payload involved in early space exploration – satellites were car-sized beasts – the data volumes were tiny in comparison to today. That’s for two reasons. First, the sensors were rudimentary, giving us low-resolution, grainy, black-and-white images, and they transmitted limited packages of information maybe once per day. Second, and this ties into the democratization point, we have many, many more satellites in orbit. But it’s not just their number that matters. It’s the frequency, resolution and precision of the readings they produce that has really exploded the amount of data we receive. So, the combination of cheaper launch options, more powerful sensors and the ability to capture and transmit more information is bringing a new set of challenges.

Q:  What are the changes in capture profiles you mention?

A: Well, take the smaller CubeSats and NanoSats that we now have. Since they occupy lower orbits, they travel at much higher speeds than their bigger, more distant counterparts in order to maintain their orbital slots, and they find themselves in a much noisier atmosphere too. So we have data coming from sensors that are whizzing around at 8km/s, being sent through an ever-changing atmosphere that introduces noise, and potentially corruption, into that data. That gives us two big challenges: managing the volume of data we collect and managing the bandwidth we use to transmit it.
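As a rough sanity check on that 8km/s figure, the speed of a circular orbit follows from v = √(GM/r) – lower orbits are faster. A minimal Python sketch, using standard reference values for Earth’s gravitational parameter and radius:

```python
import math

MU_EARTH = 3.986004418e14  # Earth's gravitational parameter GM, m^3/s^2
R_EARTH = 6_371_000.0      # mean Earth radius, m

def circular_orbital_speed(altitude_m: float) -> float:
    """Speed (m/s) of a circular orbit at the given altitude above Earth's surface."""
    return math.sqrt(MU_EARTH / (R_EARTH + altitude_m))

# A small satellite in low Earth orbit at ~500 km moves at roughly 7.6 km/s;
# drop to ~200 km and it is closer to 7.8 km/s.
print(round(circular_orbital_speed(500_000) / 1000, 2))
```

Lower, smaller satellites therefore sweep past a ground station very quickly, which compounds the bandwidth problem described above.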

Q:  How do you solve those problems?

A: First, you need very powerful processing capability for the ever-increasing amounts of data captured by evolving payloads, but you also need it as close to the edge as possible, so that the ground segment receives only the data required for specific use cases and, as a result, gets it fast. So the trick is to do as much in-orbit processing and analysis as possible – filter out the noise there, do the aggregation there, spot the anomalies there – so that what you send back is accurate, relevant and fast. Of course, there may be a need for the fuller – but cleaner – data set on the ground to aggregate with land-based observation, but that can be sent later. For audiences like, say, first responders, speed and accuracy are paramount.
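The filter-aggregate-flag pipeline described above can be sketched in a few lines of Python (all names here are hypothetical; in practice kdb+/q would do this with its built-in time-series primitives): drop out-of-range readings, aggregate into fixed windows, and flag anomalous windows, so that only compact summaries need to be downlinked.

```python
from statistics import mean, pstdev

def edge_summarise(readings, lo=-50.0, hi=50.0, window=10, z_thresh=3.0):
    """Filter, window-aggregate and anomaly-flag raw sensor readings
    so only compact summaries need to be transmitted to ground."""
    clean = [r for r in readings if lo <= r <= hi]   # drop obvious noise/corruption
    summaries = []
    for i in range(0, len(clean), window):           # fixed-size aggregation windows
        w = clean[i:i + window]
        mu, sd = mean(w), pstdev(w)
        # flag windows whose extreme values sit far from the window mean
        anomaly = sd > 0 and max(abs(x - mu) for x in w) > z_thresh * sd
        summaries.append({"mean": round(mu, 3), "sd": round(sd, 3), "anomaly": anomaly})
    return summaries  # this, not the raw stream, is what gets downlinked
```

The point of the sketch is the shape of the computation, not the thresholds: a raw stream of thousands of readings reduces to a handful of window summaries, which is exactly the bandwidth saving in-orbit processing buys.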

Q:  You obviously saw Kx as something that helps greatly in that area.

A: Yes, Kx is just perfect for addressing these requirements – both from a technical, and from a reliability, perspective.

On the technical side, its low footprint means that it can be installed on these smaller devices and perform the processing where it is needed – in orbit. Kdb+ is optimized for dealing with high-volume, time-series, structured (and unstructured) data – exactly what these sensors collect, and will increasingly produce in massive quantities. Add to that the “green” dimension: kdb+’s extremely small footprint equates to lower power consumption – another critical consideration for space.

Then you have the power of kdb+’s highly efficient data processing. Gathering and processing data in-orbit is a complex event processing (CEP) problem – data streams in high volumes, from multiple sources, in different formats, over different wavelengths, and from different angles of capture, and all of it needs to be pre-processed. And especially with machine learning, the data has to be combined with vast amounts of historical data. That’s a big ask. I know Kx can do it. I don’t know of any others that can.
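The CEP pattern described here – scoring each live event against accumulated historical data as it arrives – can be illustrated with a toy Python class (all names hypothetical; kdb+ expresses this natively with streaming tables and joins against historical partitions):

```python
from collections import defaultdict

class StreamScorer:
    """Score live events against a per-source running history:
    a toy stand-in for joining a real-time stream with historical data."""

    def __init__(self):
        self.count = defaultdict(int)
        self.total = defaultdict(float)

    def ingest(self, source, value):
        n = self.count[source]
        hist_mean = self.total[source] / n if n else None
        # update history *after* scoring, so each event is compared
        # against what was known before it arrived
        self.count[source] += 1
        self.total[source] += value
        deviation = None if hist_mean is None else value - hist_mean
        return {"source": source, "value": value, "deviation": deviation}
```

A real system would hold far richer state per source (distributions, models, raw history), but the shape is the same: stream in, join with history, score, emit.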

Finally, we are talking about space here, beyond Earth’s atmosphere, where there is mission criticality and no facility for call-out engineers! It’s exciting to adopt new technologies, but space is a sector where reliability, robustness and data security considerations are paramount. Kx’s heritage of servicing the financial markets for over 25 years gives us that. In some respects finance is possibly one of the closest markets to space in the breadth of its considerations – from security and data governance to speed and accuracy. Kx has a pedigree in them all – that’s what de-risks its adoption in the space industry.

Q:  Can you outline some of the uses and applications of space data?

A: That’s where it gets really interesting and it’s where the space industry is heading. There is still a requirement to continue to develop the technical aspects of enabling space exploration, but much of the focus is shifting to the usability aspects of gaining insights and actually capitalizing on them. Already there are quite a few interesting things in progress as organizations such as the European Space Agency (ESA), the National Aeronautics and Space Administration (NASA) and commercial players like Airbus democratize access to space and remote sensing data. From that comes commercialization, and then a virtuous feedback loop starts as the downstream side begins to inform the upstream side on what further data it needs, and innovation, on both sides, gains momentum.

The applications vary from the esoteric to the here and now. As an example, NASA FDL are working on predictions of solar storm activity because of their potential global impact on navigation systems and electricity grids, while individual farmers are using space data to figure out when to water their crops and apply nutrients.

Q:  What about machine learning – how has that impacted space exploration?

A: After democratization, machine learning has probably been the biggest game changer. Take solar storm prediction, for example. Believe it or not, that has been largely a manual task to date – looking, and I mean physically looking, at images and trying to detect outliers visually. It has been the same in searching for exoplanets, but now automation, via machine learning, is not only taking over those tasks but extending to things like the search for long-period comets through analysis and categorization of meteor showers.

Those are the Earth-bound examples. Once you move into orbit you have things like remote sensing and situational awareness applications, where individual satellites run diagnostics and preventative maintenance algorithms that enable ground-based control to take remedial action in advance of a failure. That requires the local, on-board processing capabilities I have mentioned. Then you have constellations of satellites that are in communication with one another for optimal positioning and coordination. Once again, speed is essential here, for communication to the ground and back can introduce significant and compromising delay, especially the further we get from home.

Finally, when you consider robots or rovers on remote planets, it is impossible to give detailed inch-by-inch directions from a control room on the ground – the distance makes them inaccurate. Instead, you want a rover to recognize that it needs to shut itself down when it sees a sandstorm, and then restart and navigate around the new dunes when it’s over. To do this, it needs machine learning, which needs data. Kx can provide a solution for many of these existing, and future, problems.

Q:  So the potential is huge?

A: Absolutely. Today we have talked mainly about Earth observation and remote sensing from above. What’s equally fascinating, and we mentioned it in part already, is looking the other way – observing and listening to the cosmos, and learning how astronomy and astrophysics are being revolutionized by new technologies and processes to unlock the secrets of the universe in which we reside. We should discuss that in more detail in our next interview.

