Why Is the D.O.E. Investing Millions to Delete Scientific Data?
Many people were left puzzled when it emerged that the Department of Energy wanted to invest millions of dollars in deleting scientific information.
People in data science like to say there is no such thing as too much information. In the 1960s, a single GB seemed like an impossibly large amount of data; by 2021, cloud computing had shown that even 10 TB is not enough.
Furthermore, data does not shrink over time; it only grows. For example, if you are researching black holes or building a system to model climate change, running countless simulations and recording observations across hundreds of models can easily leave you with hundreds of gigabytes.
Of course, not every piece of information is crucial. Some of it may serve you in the long run, some may matter only for a short period, and some is simply irrelevant. Whatever the case, the piles of data keep mounting, and the useful is mixed in with the useless.
So it rings true when a well-known computer scientist at an astronomy observatory in New Mexico says that humans are now drowning in scientific data. Everything is shooting upward. And New Mexico is more relevant than ever, because it is where the state-of-the-art, next-generation VLART will be built.
Here, VLART is the acronym for the Very Large Array radio telescope. It will not help contain the data, though: it is expected to produce up to 20 million GB of data each month as it tries to observe every part of the night sky.
To put things in perspective, the amount of data pouring out of VLART would be so huge that processing it would require a computing system capable of roughly 100 quadrillion floating-point operations per second. As you might have guessed, that means a supercomputer, and only two supercomputers in the world today can process that much data in so little time.
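To make that 20-million-GB-per-month figure concrete, here is a quick back-of-the-envelope unit conversion. Python is used purely for illustration; the only input is the number quoted above.

```python
# Back-of-the-envelope: what "20 million GB per month" means as a rate.
GB_PER_MONTH = 20e6                      # figure quoted above
SECONDS_PER_MONTH = 30 * 24 * 3600       # ~2.6 million seconds

rate_gb_per_s = GB_PER_MONTH / SECONDS_PER_MONTH
petabytes_per_year = GB_PER_MONTH * 12 / 1e6    # 1 PB = 1e6 GB

print(f"sustained rate: {rate_gb_per_s:.1f} GB/s")     # ~7.7 GB/s
print(f"per year:       {petabytes_per_year:.0f} PB")  # ~240 PB
```

That is roughly 7.7 GB arriving every second, around the clock, or about 240 PB a year from a single instrument.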
The Big Data Problem
Astronomy, it must be said, is not the only field of science being bombarded with data every passing second. Bill Sportz, a government program manager who specializes in supercomputers, made a similar statement: every discipline of the scientific community is facing a crisis in controlling the surge of data.
There is no denying that the projects and supercomputers throwing data at scientists at unfathomable velocities also solve real problems and give the world more sophisticated technology. However, the uncontrolled generation of data has created a tremendous storage problem.
Nor is there any hope of this slowing down, because ever more advanced supercomputers and quantum computers are being developed every year. Hence a separate discipline of information technology was created, termed "Big Data."
To tackle this enormous amount of data, the Department of Energy organized a virtual conference to gather viewpoints. Many scientists and data experts took part in the discussion and offered opinions on how to solve the problem of data flooding. They discussed the current flow of data and the massive increase still to come from the enormous projects waiting in the pipeline.
The D.O.E. did not settle on any one approach, but the problem was given top priority. It was agreed at the virtual convention that around $14 million would be provided to research teams to search for practical solutions to the big-data problem.
Reducing the data is not the hard part; the hard part is figuring out which data is useful enough to store and which redundant information should be deleted. All of that sorting and analysis is bound to take a great deal of money and time, and much of it can only be done by a supercomputer, which will itself generate a lot of data in the process.
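The article does not say how the funded teams will decide what counts as redundant, but the simplest case, byte-identical duplicate files, can be found with plain content hashing. A minimal sketch; the function name and layout here are hypothetical, not anything from the D.O.E. projects:

```python
import hashlib
from pathlib import Path

def find_exact_duplicates(root: str) -> dict[str, list[Path]]:
    """Group files under `root` by the SHA-256 of their contents.

    This catches only byte-identical copies -- the easiest kind of
    redundancy. Deciding which *scientifically* redundant data to
    drop is the hard, expensive part the article describes.
    """
    groups: dict[str, list[Path]] = {}
    for path in Path(root).rglob("*"):
        if path.is_file():
            # Reads each file fully into memory -- fine for a sketch.
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            groups.setdefault(digest, []).append(path)
    # Keep only hashes that occur more than once.
    return {d: ps for d, ps in groups.items() if len(ps) > 1}
```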
Nevertheless, the funds to start work on removing redundant data were transferred last month. Around nine such projects have already received payments, spread across well-known universities such as M.I.T. and the national laboratories. Bill Sportz commented that they are still trying to wrap their heads around the sheer volume of data.
Along with assembling the funds, the Department of Energy has also created a roadmap. It states that the nine selected efforts, spread across the country, will look for creative approaches: compressing data, improving compression programs, giving researchers finer control over which data goes to reduction and which data stays, and so on.
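The roadmap itself is not quoted, so treat this as a hedged illustration only: one widely used idea in scientific-data compression is to quantize floating-point values to an error bound the researcher chooses and then compress losslessly. A minimal Python sketch of that general idea, not of any specific D.O.E. tool:

```python
import zlib
import numpy as np

def compress_lossy(data: np.ndarray, abs_error: float) -> bytes:
    """Quantize to multiples of `abs_error`, then deflate with zlib.

    The researcher chooses `abs_error` -- how much precision the
    science can afford to lose; coarser bounds compress better.
    """
    quantized = np.round(data / abs_error).astype(np.int64)
    return zlib.compress(quantized.tobytes())

def decompress_lossy(blob: bytes, abs_error: float) -> np.ndarray:
    ints = np.frombuffer(zlib.decompress(blob), dtype=np.int64)
    return ints.astype(np.float64) * abs_error

# Smooth signals shrink dramatically once sub-tolerance wiggles go.
tolerance = 1e-3
signal = np.sin(np.linspace(0, 100, 1_000_000))
blob = compress_lossy(signal, tolerance)
print(f"compression ratio: {signal.nbytes / len(blob):.0f}x")
restored = decompress_lossy(blob, tolerance)
assert np.max(np.abs(restored - signal)) <= tolerance / 2 + 1e-12
```

The key design choice is that the researcher, not the compressor, picks the tolerance, which is exactly the kind of control the roadmap describes handing back to scientists.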
Furthermore, other solutions have been put forward, such as reducing the number of dimensions in a data set and triggering instruments to record observations only when a critical phenomenon occurs.
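The article gives no details of such triggers, but the idea amounts to a software trigger on the data stream: keep a short rolling window and persist a sample only when it deviates sharply from the recent baseline. A hypothetical sketch, with illustrative names and thresholds not taken from the article:

```python
import random
from collections import deque

def triggered_stream(samples, window=100, threshold=5.0):
    """Yield only samples deviating from the rolling mean by more than
    `threshold` standard deviations; everything else is discarded at
    the instrument instead of ever being stored.
    """
    buf = deque(maxlen=window)
    for x in samples:
        if len(buf) == window:
            mean = sum(buf) / window
            var = sum((v - mean) ** 2 for v in buf) / window
            if var > 0 and abs(x - mean) > threshold * var ** 0.5:
                yield x                    # "critical phenomenon"
        buf.append(x)

# Gaussian noise with one injected spike: almost everything is dropped.
noisy = [random.gauss(0, 1) for _ in range(10_000)]
noisy[5_000] += 50
kept = list(triggered_stream(noisy))       # essentially just the spike
print(f"stored {len(kept)} of {len(noisy)} samples")
```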
A report shared by slate.com states that analyzing such complex and redundant data will involve machine learning to some extent.
Byung-Joon Hun, a mathematician, has shared his view that the problem may be more complex than people believe: big data forces scientists to be less systematic than they want to be. Researchers are sometimes forced to delete data simply because there is no space to store it.
Conclusion
The U.S. Department of Energy is taking the uncontrolled surge of scientific data seriously. The department is looking for solutions that reduce redundant information without losing anything important.
Source: https://softwareinformation1.wordpress.com/2021/10/25/why-the-d-o-e-wants-to-delete-scientific-information-by-investing-in-millions/
Medika White arrived on the cybersecurity scene in the early 2000s, when viruses and malware were still new and slowly evolving. Her longtime love of writing, combined with an interest in the cybersecurity industry and an IT degree, has given her experience with several aspects of the security-suite industry, including blogging at webroot.com/safe.