The Hubble Space Telescope has helped determine the age, size, and fate of our Universe.
Among the TV series Star Trek’s many charms are its rich universe of characters and planets. Now the Dharma Planet Survey, in a new study led by University of Florida (UF) astronomer Jian Ge and a team including Tennessee State University (TSU) astronomers Matthew Muterspaugh and Gregory Henry, has shown that the science fiction may be a little less fictional: the Dharma project has discovered what may be Star Trek’s famed planet Vulcan.
“The new planet is a ‘super-Earth’ orbiting the star HD 26965, which is only 16 light-years from Earth, making it the closest super-Earth orbiting another Sun-like star,” says Ge. “The planet is roughly twice the size of Earth and orbits its star with a 42-day period just inside the star’s optimal habitable zone.” The discovery was made using the Dharma Endowment Foundation Telescope (DEFT), a 50-inch telescope located atop Mt. Lemmon in southern Arizona. The planet is the first “super-Earth” detected by the Dharma Survey.
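As a quick sanity check on those numbers, Kepler’s third law lets anyone estimate how close a 42-day orbit sits to its star. The short sketch below assumes a stellar mass of about 0.78 solar masses, a typical literature value for a K dwarf like HD 26965; that mass is an assumption, not a figure from this article.

```python
# Kepler's third law in solar units: a^3 = M * T^2,
# with a in AU, M in solar masses, and T in years.

M_STAR = 0.78      # assumed stellar mass of HD 26965 (solar masses); not from the article
T_DAYS = 42.0      # orbital period reported in the article (days)

T_years = T_DAYS / 365.25
a_au = (M_STAR * T_years ** 2) ** (1.0 / 3.0)
print(f"semi-major axis ≈ {a_au:.2f} AU")
```

Under that assumed mass, the orbit comes out to roughly 0.22 AU, well inside Mercury’s distance from our Sun, which fits a planet sitting “just inside the star’s optimal habitable zone” around a star somewhat cooler than the Sun.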
“The orange-tinted HD 26965 is only slightly cooler and slightly less massive than our Sun, is approximately the same age as our Sun, and has a 10.1-year magnetic cycle nearly identical to the Sun’s 11.6-year sunspot cycle,” explains Muterspaugh, who helped to commission the Dharma spectrograph on the TSU 2-meter automatic spectroscopic telescope. “Therefore,” he adds, “HD 26965 may be an ideal host star for an advanced civilization.”
“Star Trek fans may know the star HD 26965 by its alternative moniker, 40 Eridani A,” says Henry, who collected precise brightness measurements of the star at TSU’s automated observatory needed to confirm the presence of the planet. “Vulcan was connected to 40 Eridani A in the publications ‘Star Trek 2’ by James Blish (Bantam, 1968) and ‘Star Trek Maps’ by Jeff Maynard (Bantam, 1980),” explains Henry. In a letter published in the periodical “Sky and Telescope” in July 1991, Gene Roddenberry, the creator of Star Trek, along with Sallie Baliunas, Robert Donahue, and George Nassiopoulos of the Harvard-Smithsonian Center for Astrophysics confirmed the identification of 40 Eridani A as Vulcan’s host star. The 40 Eridani star system is composed of three stars. Vulcan orbits the primary star, and the two companion stars “would gleam brilliantly in the Vulcan sky,” they wrote in their 1991 letter.
“Vulcan is the home planet of Science Officer Mr. Spock in the original ‘Star Trek’ Sci-Fi series,” says Henry. “Spock served on the starship Enterprise, whose mission was to seek out strange new worlds, a mission shared by the Dharma Planet Survey.”
“This star can be seen with the naked eye, unlike the host stars of most of the known planets discovered to date. Now anyone can see 40 Eridani on a clear night and be proud to point out Spock’s home,” says Bo Ma, a UF postdoc on the team and the first author of the paper just published in “Monthly Notices of the Royal Astronomical Society.”
“This discovery demonstrates that fully dedicated telescopes conducting high-cadence, high-precision radial velocity observations in the near future will continue to play a key role in the discovery of more super-Earths and even Earth-like planets in the habitable zones around nearby stars,” says Ge. “I am very grateful to the donor of our Dharma Planet Survey, Mr. Mickey Singer, who recognized the importance of this project and has continuously provided support to make this and future discoveries possible.”
This article was issued by the University of Florida in Gainesville.
Special Announcement: We will be having a guest speaker giving a talk on “Personal Archiving: Saving Your Fandom,” starting at 1:00pm on Sept 9, 2018.
This fascinating talk will start promptly at 1:00pm, so bring your friends and arrive early to get a good seat.
Happy Labor Day!!!!
Reminder: the September meeting is a budget meeting, and we will be accepting money for next year’s membership. Active and Associate Members’ dues need to be paid by Oct 6, 2018.
From July 6-8, at its usual stomping ground of The Hunt Valley Inn-Marriott Delta, the fan- and volunteer-run Shore Leave will celebrate its 40th anniversary.
The headlining guest will be USS Enterprise Capt. James T. Kirk himself, the still vibrant, 87-year-old William Shatner.
Thank you, Joe and Stacey Cress, for this wonderful article. The science fiction and fantasy world as a whole is a foundation of their marriage (est. 2003). And that’s why, for three days nearly every summer, you’ll find them sojourning in Hunt Valley, Maryland, a relatively short trip down Interstate 83, for the annual Shore Leave sci-fi convention.
For the full article go to this site: https://www.ydr.com/story/things-to-do/2018/06/15/shore-leave-40-celebrates-geek-culture-star-trek-william-shatner-science-fiction-convention/700541002/
Data from the first NASA satellite mission dedicated to measuring the water content of soils is now being used operationally by the U.S. Department of Agriculture (USDA) to monitor global croplands and make commodity forecasts.
The Soil Moisture Active Passive mission, or SMAP, launched in 2015 and has helped map the amount of water in soils worldwide. Now, with tools developed by a team at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, SMAP soil moisture data are being incorporated into the Crop Explorer website of the USDA’s Foreign Agricultural Service, which reports on regional droughts, floods and crop forecasts. Crop Explorer is a clearinghouse for global agricultural growing conditions, such as soil moisture, temperature, precipitation, vegetation health and more.
“There’s a lot of need for understanding, monitoring and forecasting crops globally,” said John Bolten, research scientist at Goddard. “SMAP is NASA’s first satellite mission devoted to soil moisture, and this is a very straightforward approach to applying that data.”
Variations in global agricultural productivity have tremendous economic, social and humanitarian consequences. Among the users of these new SMAP data are USDA regional crop analysts who need accurate soil moisture information to better monitor and predict these variations.
“The USDA does crop forecasting activities from a global scale, and one of the main pieces of information for them is the amount of water in the soil,” said Iliana Mladenova, a research scientist at Goddard.
The USDA has used computer models that incorporate precipitation and temperature observations to indirectly calculate soil moisture. This approach, however, is prone to error in areas lacking high-quality, ground-based instrumentation. Now, Mladenova said, the agency is incorporating direct SMAP measurements of soil moisture into Crop Explorer. This allows the agriculture analysts to better predict where there could be too little, or too much, water in the soil to support crops.
These soil moisture conditions, along with tools to analyze the data, are also available on Google Earth Engine. There, researchers, nonprofit organizations, resource managers and others can access the latest data as well as archived information.
“If you have better soil moisture data and information on anomalies, you’ll be able to predict, for example, the occurrence and development of drought,” Mladenova said.
The timing of the information matters as well, she added — if there’s a short dry period early in the season, it might not have an impact on the total crop yield, but if there’s a prolonged dry spell when the grain should be forming, the crop is less likely to recover.
With global coverage every three days, SMAP can provide the Crop Explorer tool with timely updates of the soil moisture conditions that are essential for assessments and forecasts of global crop productivity.
For more than a decade, USDA Crop Explorer products have incorporated soil moisture data from satellites. It started with the Advanced Microwave Scanning Radiometer-E instrument aboard NASA’s Aqua satellite, but that instrument stopped gathering data in late 2011. Soil moisture information from the European Space Agency’s (ESA) Soil Moisture and Ocean Salinity mission is also being incorporated into some of the USDA products. This new, high-quality input from SMAP will help fill critical gaps in soil moisture information.
SMAP is managed for NASA’s Science Mission Directorate in Washington by the agency’s Jet Propulsion Laboratory in Pasadena, California, with instrument hardware and science contributions made by Goddard.
To learn more about SMAP, visit:
The USDA’s Crop Explorer tool is at:
GRACE was the Gravity Recovery and Climate Experiment. It consisted of two satellites in orbit around Earth. Launched in March of 2002, the GRACE mission accurately mapped variations in Earth’s gravity field. Designed for a nominal mission lifetime of five years, GRACE operated in an extended mission phase through October, 2017.
The GRACE Follow-On (GRACE-FO) mission, a collaboration between NASA and the German Research Centre for Geosciences (GFZ), will continue the work of monitoring changes in the world’s water cycle and surface mass that was so well performed by the original GRACE mission.
The two GRACE-FO satellites were launched on 22 May from Vandenberg AFB in California. They were part of a payload on a SpaceX Falcon 9 rocket that also carried five Iridium NEXT communications satellites. The two are currently in position relative to each other and are in system checkout.
The Falcon 9 was a bus delivering its payload. On liftoff, the first-stage engines burned for approximately 2 minutes and 45 seconds before shutting down at main engine cutoff (MECO). The Falcon 9’s first and second stages separated seconds later, at which point the second-stage engine ignited for the first burn (SES1) until the vehicle reached the altitude of the GRACE injection orbit, 305 miles (490 kilometers). During this burn, the payload fairing (the launch vehicle’s nose cone) separated into two halves like a clamshell and fell away.
When the rocket’s second stage completed its ascent to the injection orbit altitude, it pitched its nose down 30 degrees and rolled so that one of the twin GRACE-FO satellites faced down toward Earth and the other faced up toward space. Then the second-stage engine cut off (SECO).
About 10 minutes after liftoff, a separation system on the second stage deployed the GRACE-FO satellites. Separation occurred over the Pacific Ocean exactly as planned, at about 17.5 degrees North latitude, 122.6 degrees West longitude. After the GRACE-FO satellites deployed, the Falcon 9 second stage coasted for half an orbit to allow for some separation, then reignited its engine (SES2) to take the Iridium NEXT satellites to a higher orbit for deployment.
Unlike other Earth-observing satellites, which carry instruments that observe some part of the electromagnetic spectrum, the two GRACE-FO satellites themselves are the instrument. The prime instrument measures the tiny changes in the distance between the pair, which arise from the slightly varying gravitational forces of the changing mass below. Researchers produce monthly maps of water and mass change by combining this information with GPS measurements of exactly where the satellites are and accelerometer measurements of other forces acting upon the spacecraft, such as atmospheric drag.
How they work
GRACE-FO’s raw data will be a series of measurements showing how far apart the two satellites are from each other. The twin satellites follow each other in orbit around the Earth, separated by about 137 miles (220 km). They constantly send microwave signals to each other to measure the distance between them.
As the pair circles the Earth, areas of slightly stronger gravity (greater mass concentration) affect the lead satellite first, pulling it away from the trailing satellite. As the satellites continue, the trailing satellite is pulled toward the lead satellite as it passes over the gravity anomaly. The change in distance would certainly be imperceptible to our eyes, but the extremely precise microwave ranging system on GRACE-FO is designed to detect minuscule changes in the distance between the satellites. A highly accurate accelerometer, located at each satellite’s center of mass, measures the non-gravitational accelerations (such as those due to atmospheric drag) so that only accelerations caused by gravity are considered. Satellite Global Positioning System (GPS) receivers determine the exact position of the satellite over the Earth to within a centimeter or less. All this information from the satellites will be used to construct monthly maps of the Earth’s average gravity field, offering details of how mass, in most cases water, is moving around the planet.
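To make that measurement principle concrete, here is a minimal one-dimensional toy model, not the mission’s actual geometry or processing: two satellites fly single file at orbital speed over a point-mass anomaly 490 km below the track, and we watch their separation stretch and relax. The anomaly mass and every other number here are illustrative assumptions chosen to make the effect easy to see.

```python
# Toy 1-D model of the GRACE-FO ranging principle (illustrative only).
# Two satellites fly in a line at orbital speed over a point-mass anomaly
# (think of a large parcel of water) sitting 490 km below the track.
# We integrate the anomaly's along-track pull on each satellite and
# record how their separation changes. All numbers are made-up
# assumptions; the real mission geometry and processing are far richer.

G = 6.674e-11      # gravitational constant (m^3 kg^-1 s^-2)
DM = 1e15          # toy anomaly mass (kg), ~1,000 Gt of water; exaggerated
ALT = 490e3        # height of the orbit above the anomaly (m)
V = 7600.0         # along-track orbital speed (m/s)
SEP0 = 220e3       # nominal satellite separation (m)
DT = 0.5           # integration time step (s)

def along_track_accel(x):
    """Along-track pull on a satellite at position x toward an anomaly at x = 0."""
    dx = -x                          # along-track offset from satellite to anomaly
    r2 = dx * dx + ALT * ALT         # squared distance to the anomaly
    return G * DM * dx / r2 ** 1.5

def simulate(t_span=1200.0):
    # Lead satellite starts well before the anomaly; the trailer is 220 km behind.
    x1 = -0.5 * t_span * V
    x2 = x1 - SEP0
    v1 = v2 = V
    ds = []                          # separation change relative to SEP0 (m)
    for _ in range(int(t_span / DT)):
        v1 += along_track_accel(x1) * DT
        v2 += along_track_accel(x2) * DT
        x1 += v1 * DT
        x2 += v2 * DT
        ds.append((x1 - x2) - SEP0)
    return ds

ds = simulate()
peak = max(ds)
print(f"peak separation change: {peak * 1e3:.2f} mm")
```

Even for this deliberately exaggerated anomaly, the separation bulges by only a fraction of a millimeter before relaxing back as both satellites pass over it; real monthly mass signals are far smaller, which is why the ranging system must resolve micron-scale changes.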
What they do for us
GRACE-FO tracks liquid and frozen water by measuring month-to-month changes in Earth’s gravitational pull very precisely. More than 99 percent of our planet’s gravitational pull doesn’t change from one month to the next, because it represents the mass of the solid Earth itself. But a tiny fraction of Earth’s mass is constantly on the move, and it is mostly water: Rain is falling, dew is evaporating, ocean currents are flowing, ice is melting and so on. GRACE-FO’s maps of regional variations in gravity will show us where that small fraction of overall planetary mass is moving every month.
5 Things We Didn’t Know Before GRACE
GRACE observations have been used in more than 4,300 research papers to date — a very high number for a single Earth science mission. Here’s a list of five important findings from those 4,300-plus papers.
- Melting ice sheets and dwindling aquifers are contributing to Earth’s rotational wobbles.
- A few years of heavy precipitation can cause so much water to be stored on land that global sea level rise slows or even stops briefly.
- A third of the world’s underground aquifers are being drained faster than they can be replenished.
- In the Amazon, small fires below the tree canopy may destroy more of the forest than deforestation does — implying that climatic conditions such as drought may be a greater threat to the rainforest than deforestation is.
- Australia seesaws up and down by two or three millimeters each year because of changes to Earth’s center of mass that are caused by the movement of water.
Permafrost in the coldest northern Arctic area — formerly thought to be almost unaffected by global warming because of its extreme environment — will thaw enough to become a permanent source of carbon dioxide to the atmosphere in this century. There are people alive today who will see it.
The study was led by scientist Nicholas Parazoo of NASA’s Jet Propulsion Laboratory in Pasadena. The team combined soil temperature data from northern Alaska and northern Siberia, provided by the University of Alaska, Fairbanks, with a numerical model from the National Center for Atmospheric Research in Boulder, Colorado. The model calculates changes in carbon emissions as plants grow and permafrost thaws in response to climate change. The team assessed when the Arctic will transition to a carbon source from the carbon-neutral area it is today, where some processes remove about as much carbon from the atmosphere as other processes emit.
Permafrost is soil that has remained frozen for years or centuries under topsoil. It contains organic material, such as leaves, that froze without decaying. When the permafrost temperature rises, decay begins, releasing carbon dioxide and methane. Both are greenhouse gases.
“Some of the very cold, stable permafrost in the highest latitudes in Alaska and Siberia appeared to be sheltered from extreme climate change, and we didn’t expect much impact over the next couple hundred years,” said Parazoo. But the model surprised the team. It showed a speed-up of organic decay at lower-than-expected temperatures, which caused an earlier release of greenhouse gases. The peak transition will occur in 40 to 60 years. The study calculated that as thawing continues, by the year 2300, total carbon emissions from this region will be 10 times as much as all human-produced fossil fuel emissions in 2016.
The bottom line is that global warming, as with global cooling that produced the ice ages, is a non-linear process. Once it gets to a certain point, it takes on a life of its own regardless of what humans do. Natural CO2 emissions from the extreme Arctic and the oceans (which are 70% of the world’s surface) will swamp human contributions in a self-sustaining cycle of warming that will continue until their sources are significantly depleted, at which time a new equilibrium will be established.
In today’s Education Report, I would like to talk about the protection of the environment, and two success stories. With this report, I hope to show that the way government is working today is really about the same as it has always worked.
The year was 1970. President Richard Nixon was greatly troubled by the degradation of the quality of our environment. It was part of his platform during his run for the Presidency, and he followed through. In the State of the Union message he delivered to Congress on Jan 22, 1970, he had this to say:
In the year 1980, will the President look back on a decade in which 70% of our people lived in metropolitan areas choked by traffic, suffocated by smog, poisoned by water, deafened by noise, and terrorized by crime? The great question of the 1970s is, shall we surrender to our surroundings, or shall we make our peace with nature and begin to make reparations for the damage we have done to our air, to our land, and to our water?
Restoring nature to its natural state is a cause beyond party and beyond factions. It has become a common cause of all the people of this country. It is a cause of particular concern to young Americans, because they more than we will reap the grim consequences of our failure to act on programs which are needed now if we are to prevent disaster later. Clean air, clean water, open spaces — these should once again be the birthright of every American. If we act now, they can be.
Six months later, on July 9, 1970, he delivered a message to Congress proposing to establish the Environmental Protection Agency. In it, he said:
Our national government today is not structured to make a coordinated attack on the pollutants which debase the air we breathe, the water we drink, and the land that grows our food. Indeed, the present governmental structure for dealing with environmental pollution often defies effective and concerted action. Despite its complexity, for pollution control purposes the environment must be perceived as a single, interrelated system. Present assignments of departmental responsibilities do not reflect this interrelatedness.
A far more effective approach to pollution control would:
- Identify pollutants.
- Trace them through the entire ecological chain, observing and recording changes in form as they occur.
- Determine the total exposure of man and his environment.
- Examine interactions among forms of pollution.
- Identify where in the ecological chain interdiction would be most appropriate.
One of the early actions of the EPA was to ban Dichlorodiphenyltrichloroethane, commonly known as DDT. The EPA held seven months of hearings in 1971–1972, with scientists giving evidence for and against DDT. In the summer of 1972, the EPA announced the cancellation of most uses of DDT, exempting public health uses under some conditions. Immediately after the announcement, both the Environmental Defense Fund (EDF), a private environmental group, and the DDT manufacturers filed suit against the EPA. Industry sought to overturn the ban, while EDF wanted a comprehensive ban. The cases were consolidated, and in 1973 the United States Court of Appeals for the District of Columbia Circuit ruled that the EPA had acted properly in banning DDT.
Some uses of DDT continued under the public health exemption. For example, in June 1979, the California Department of Health Services was permitted to use DDT to suppress flea vectors of bubonic plague.
I include DDT here because, for all its benefits, it represented a clear and present danger not just to humans, but also to a wealth of beneficial insects, and especially to birds. It was a carcinogen, and it killed a lot of animals and insects. DDT is a long-lived chemical, and one of its many effects is to cause the shells of birds’ eggs to thin out. Our national bird, the Bald Eagle, was so badly affected by DDT that most of its eggs cracked and collapsed before the chicks matured. Being an apex predator, Eagles were never populous. But the combination of the chemical and loss of habitat left them an endangered species in the ’60s. Around 1900, there were about 11,000 pairs around the Chesapeake Bay. In 1967, when they were declared endangered, there were fewer than 90 breeding pairs around the Chesapeake.
I remember hearing an EDF scientist talking about DDT in a radio interview. He said it remained about 30 years in the environment before there was enough breakdown of the chemical for its effects to die out. Sure enough, in the late ’90s we started hearing about the comeback of Bald Eagles. In 2007, there were more than 11,000 pairs nesting up and down the Chesapeake Bay, including several in the D.C. metropolitan area. Banning DDT was not the only reason they recovered, but it was one of the big drivers.
I also want to mention the complete banning of chlorofluorocarbons (CFCs) in the ’90s. One of the most well known was Freon. Like DDT, CFCs have a very long life (due to their low reactivity). They have an average lifetime of as much as 100 years, depending on which variation you are discussing. And they went everywhere. Studies in the ’50s and ’60s found them in the Arctic despite a lack of people there. CFCs released into the environment drifted everywhere, including into the stratosphere, where ultraviolet radiation breaks the chemical down into free radicals that combine chemically with ozone.
Not good. Ozone is the protective layer that attenuates ultraviolet radiation that is damaging to life on the ground. Stratospheric ozone protects life on the planet by absorbing potentially harmful ultraviolet radiation that can cause skin cancer and cataracts, suppress immune systems, and damage plant life.
Studies in the late 1970s saw a steady decline of about four percent per decade in the total amount of ozone in the ozone layer, and a much larger springtime decrease in stratospheric ozone around Earth’s polar regions (the ozone hole). In 1978, the United States banned the use of CFCs such as Freon in aerosol cans, the beginning of a long series of regulatory actions against their use. By 1987, in response to a dramatic seasonal depletion of the ozone layer over Antarctica, diplomats forged the Montreal Protocol, which called for drastic reductions in the production of CFCs. In 1989, 12 European Community nations agreed to ban the production of all CFCs by the end of the century. In 1990, diplomats met in London and voted to significantly strengthen the Montreal Protocol by calling for a complete elimination of CFCs by the year 2000. It was a long process, but production of new stocks ceased in most, if not all, countries in 1994.
Ozone levels stabilized by the mid-1990s and began to recover in the 2000s. Recovery is projected to continue over the next century, with Antarctic ozone expected to return to pre-1980 levels by around 2075. We are just now beginning to see good effects. This article came across my desk this month:
For the first time, scientists have shown through direct observations of the ozone hole by a satellite instrument, built by NASA’s Jet Propulsion Laboratory in Pasadena, California, that levels of ozone-destroying chlorine are declining, resulting in less ozone depletion.
Measurements show that the decline in chlorine, resulting from an international ban on chlorine-containing human-produced chemicals called chlorofluorocarbons (CFCs), has resulted in about 20 percent less ozone depletion during the Antarctic winter than there was in 2005. The study was published Jan. 4 in the journal Geophysical Research Letters.