
Wood Windows are Cooler than Glass

August 16, 2016

Lee Tune 301-405-4679, Martha Heil 301-405-0876

UMD study shows natural microstructures in transparent wood key to lighting and insulation advantages

COLLEGE PARK, Md. – Engineers at the A. James Clark School of Engineering at the University of Maryland demonstrate in a new study that windows made of transparent wood could provide more even and consistent natural lighting and better energy efficiency than glass.    

In a paper titled “Wood composite as an energy efficient building material: Guided sunlight transmittance and effective thermal insulation,” published in the peer-reviewed journal Advanced Energy Materials, the team, headed by Liangbing Hu of UMD’s Department of Materials Science and Engineering and the Energy Research Center, lays out research showing that their transparent wood provides better thermal insulation and lets in nearly as much light as glass, while eliminating glare and providing uniform and consistent indoor lighting. The findings advance earlier published work on their development of transparent wood.

The transparent wood lets through just a little less light than glass, but far less heat, said Tian Li, the lead author of the new study. “It is very transparent, but still allows for a little bit of privacy because it is not completely see-through. We also learned that the channels in the wood transmit light at wavelengths around the visible range, but block the wavelengths that carry mostly heat,” said Li.

The team’s findings were derived, in part, from tests on a tiny model house the team built with a transparent wood panel in the ceiling. The tests showed that light was more evenly distributed around a space with a transparent wood roof than with a glass roof.

The channels in the wood direct visible light straight through the material, but the cell structure that still remains bounces the light around just a little bit, a property called haze. This means the light does not shine directly into your eyes, making it more comfortable to look at. The team photographed the transparent wood’s cell structure in UMD’s Advanced Imaging and Microscopy (AIM) Lab.

Transparent wood retains all the cell structures of the original piece of wood. The wood is cut against the grain, so that the channels that drew water and nutrients up from the roots lie along the shortest dimension of the window. The new transparent wood uses these natural channels to guide sunlight through the wood.

As the sun passes over a house with glass windows, the angle at which light shines through the glass changes as the sun moves. With windows or panels made of transparent wood instead of glass, as the sun moves across the sky, the channels in the wood direct the sunlight in the same way every time. 

"This means your cat would not have to get up out of its nice patch of sunlight every few minutes and move over," Li said. "The sunlight would stay in the same place. Also, the room would be more equally lighted at all times."

Working with transparent wood is similar to working with natural wood, the researchers said.  However, their transparent wood is waterproof due to its polymer component. It also is much less breakable than glass because the cell structure inside resists shattering. 

The research team has recently patented their process for making transparent wood. The process starts with bleaching all of the lignin out of the wood; lignin is the component that makes wood both brown and strong. The wood is then soaked in epoxy, which adds strength back in and also makes the wood clearer. The team has used tiny squares of linden wood about 2 cm x 2 cm, but the wood can be any size, the researchers said.

UMD Research Finds Maternal Death Rate Increasing in U.S.

August 11, 2016

Sara Gavin 301-405-1733

COLLEGE PARK, Md. — The number of women who die during or soon after pregnancy is on the rise in the United States, while on the decline internationally, according to new research from the University of Maryland Population Research Center (MPRC).

The U.S. maternal mortality rate rose by nearly 27 percent between 2000 and 2014, according to the study. For every 100,000 live births, nearly 24 women died during, or within 42 days after pregnancy in 2014. That was up from nearly 19 per 100,000 in 2000.
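The reported increase follows directly from the two rates. As a quick arithmetic sketch (the precise values 18.8 and 23.8 deaths per 100,000 are assumed for illustration; the article gives only "nearly 19" and "nearly 24"):

```python
# Percent change in the U.S. maternal mortality rate, 2000 to 2014,
# computed from assumed illustrative values (per 100,000 live births).
rate_2000 = 18.8  # "nearly 19" in 2000 (assumed precise value)
rate_2014 = 23.8  # "nearly 24" in 2014 (assumed precise value)

pct_increase = (rate_2014 - rate_2000) / rate_2000 * 100
print(round(pct_increase, 1))  # 26.6, i.e. "nearly 27 percent"
```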

“It’s important to note that maternal death is still a rare event, but it is of great concern that the rate is increasing, rather than improving,” said the study’s lead author Dr. Marian MacDorman, an MPRC research professor. “Maternal mortality is an important indicator of the overall quality of health care both nationally and internationally.”

MacDorman noted that some of the national increase in maternal deaths has to do with better reporting. In 2003, U.S. states began revising their death certificates to include specific questions about pregnancy. However, the 27 percent increase remains after taking these changes in reporting into account, MacDorman said. What makes the U.S. statistics even more discouraging is that the numbers are trending in the opposite direction around the world: Study authors say the United States would rank 30th on a list of 31 countries reporting data on maternal mortality to the Organization for Economic Cooperation and Development—beating out only Mexico.

“Clearly, the U.S. maternal mortality rate is moving in the wrong direction,” MacDorman said. “There is a need to redouble efforts to prevent maternal deaths and improve maternity care for the 4 million U.S. women giving birth each year.”

The UMD research team did discover one bright spot amid a glum national picture. California showed a marked decline in maternal mortality from 2003 to 2014, following concerted efforts in the state to address the issue, including a statewide pregnancy-associated mortality review and initiatives focused on preventing some of the most common contributors to maternal death: obstetric hemorrhage and preeclampsia.

“These efforts appear to have reduced maternal mortality in California and could serve as a model for other states,” MacDorman said.




Clear Link Found between Warming Climate and Rise in Ocean-borne Bacterial Illnesses

August 10, 2016

Lee Tune 301-405-4679

COLLEGE PARK, Md. – A new UMD-led, international study shows that over the past half century there has been a clear correlation between warming of North Atlantic waters, increasing numbers of Vibrio bacteria in those waters, and rising numbers of people along U.S. and European North Atlantic coasts who have become infected by pathogenic Vibrio bacteria – including species that can cause life-threatening infections.  

In a paper published online on August 8, 2016, in the journal Proceedings of the National Academy of Sciences (PNAS), University of Maryland Distinguished University Professor Rita Colwell and co-authors from Italy, Britain, and Germany write: “The evidence is strong that the ongoing climate change is influencing outbreaks of Vibrio infections on a worldwide scale.” However, they say that to their knowledge theirs is the first study to provide evidence linking decades of climate warming, Vibrio abundance and Vibrio-associated disease.

Areas in the temperate North Atlantic where samples were collected for retrospective molecular studies of Vibrio populations over the period 1958–2011

“It will come as a surprise to the public to know that there is this direct connection between the oceans and human health, with respect to infectious disease,” said Colwell, who has studied cholera for nearly 50 years and has written more than 750 publications. A former director of the National Science Foundation and former president of the American Association for the Advancement of Science, Colwell is currently a distinguished university professor at the University of Maryland and the Johns Hopkins University's Bloomberg School of Public Health, a member of the National Academy of Sciences, and Chairman and Global Science Officer of CosmosID, Inc.

Vibrio bacteria are found in large numbers among the small and microscopic organisms that constitute marine plankton. There are more than 100 Vibrio bacteria species that can cause disease in animals, with about a dozen that are human pathogens. Cholera, an acute diarrheal infection caused by ingestion of food or water contaminated by Vibrio cholerae, is responsible for an estimated 3–5 million illnesses and more than 100,000 deaths every year, according to the World Health Organization. 

In the United States, the CDC estimates that 80,000 people become sick from Vibrio infections and 100 die from their infections every year. Some Vibrio species, such as Vibrio vulnificus, can get into the bloodstream. Half the people who get a Vibrio vulnificus infection die, sometimes within a day or two of becoming ill; others survive only after having limbs amputated.

Uncovering Vibrio data in 50-year-old samples

Because there was little existing data on ocean microbes covering the time and geographical scales needed for their study, the team developed a novel, technically challenging approach to studying long-term changes in the populations of Vibrio bacteria. Their approach took advantage of the Continuous Plankton Recorder, one of the most geographically extensive and longest-duration archives of marine biological samples. However, using this archive required the team to overcome the formidable challenge of recovering testable DNA from formalin-fixed samples of plankton that ranged from recently collected to more than 50 years old. Molecular analysis of these DNA samples was then conducted to find and identify the types and amounts of Vibrio bacteria present.

The researchers analyzed 133 samples collected during the past half-century at nine locations: northern North Sea, southern North Sea, western English Channel, Iberian coast, Iceland coast, Irish Sea, Newfoundland, Nova Scotia, and the North Atlantic. 

The team’s groundbreaking analysis is par for the course for Colwell, a microbiologist whose work bridges many other areas, including ecology, infectious disease, public health, computer and satellite technology, and international diplomacy. She has made a career of novel approaches and outside-the-box thinking. Previous breakthroughs include her research showing that plankton serves as an environmental reservoir for human Vibrio cholerae infection via contaminated drinking water, a finding that upended the then-conventional wisdom that cholera was spread only by person-to-person contact. Her development of the first model to apply remote satellite imaging to track and predict outbreaks of cholera before they occur became a prototype for infectious disease monitoring and prevention around the world.

Climate influence on Vibrio and associated human diseases during the past half-century in the coastal North Atlantic, by Luigi Vezzulli, Chiara Grande, and Carla Pruzzo, University of Genoa; Philip C. Reid, Pierre Hélaouët, and Martin Edwards, Sir Alister Hardy Foundation for Ocean Science; Martin Edwards, University of Plymouth; Manfred G. Höfle and Ingrid Brettar, Helmholtz Centre for Infection Research; Rita R. Colwell, University of Maryland and Johns Hopkins Bloomberg School of Public Health.


UMD Researchers Develop Tool to Counter Public Health IT Challenges

August 9, 2016

Kenyon Crowley 301-405-9593
Ritu Agarwal 301-405-3121

Zika Brings Issue to Forefront

COLLEGE PARK, Md. - Front-line protection of U.S. communities against disease epidemics relies on seamless information sharing between public health officials and doctors, plus the wherewithal to act on that data. But health departments have faltered in this mission, lacking the guidance to strategize effectively about appropriate IT investments. “And incidents like the current Zika crisis bring the issue to the forefront,” says Ritu Agarwal, Robert H. Smith Dean's Chair of Information Systems and Senior Associate Dean for Faculty and Research at the University of Maryland’s Robert H. Smith School of Business.

Agarwal, with a team of UMD researchers, recently finished a two-year “intensive analysis” of the rollout of an electronic health records system in Montgomery County, Md., and a local primary care coalition, which works with a system of hospitals and clinics designed to provide safety net services to low-income patients.

“We uncovered a host of barriers and obstacles to effective use of information, including the complexity and usability of the software, the inability of the software to support certain unique public health reporting needs, the learning curve for public health workers, and the lack of standards for effective data exchange,” Agarwal says. “All of this does not bode well, either for crisis response or for proactive crisis anticipation.” 

Their findings are published in Frontiers in Public Health Services and Systems Research and detail a new tool, a Public Health Information Technology (PHIT) Maturity Index, to better understand and counter the shortcomings they observed.

“Health departments can apply the index to assessing their IT capabilities, benchmarking with their peers, setting specific goals and fostering a cycle of continuous improvement,” says coauthor and researcher Kenyon Crowley, deputy director of the Center for Health Information and Decision Systems (CHIDS) in the Smith School.

Prior to late-July confirmation of U.S. Zika cases, Centers for Disease Control and Prevention director Thomas Frieden warned: “Make no mistake: The Zika virus is an emergency that we need to address.” But Congress recessed for summer without approving emergency funding to combat the virus linked to microcephaly-stricken newborns.

But the challenges facing public health managers run deeper than a lack of funding, says Agarwal. “Health officials need to know the source of the infection, who the infected individual has contacted, where it occurred and the circumstances under which it occurred. The list goes on,” she says. “In other words, a complete and accurate picture of every incident is the foundation for developing an effective response strategy.”

Public health managers can deploy the [PHIT Maturity] index to “measure their departments’ progress in using IT to support its public health mission, or in other words, its journey towards maturity,” Agarwal says.

Agarwal says “fulfilling the mission” broadly would mean that information from Zika diagnoses, for example, flows to the right public health official, whether it comes from patient-hospital encounters stored in a state health information exchange, from a primary care setting when a patient presents for treatment, or from cases documented at a public health care service location.

Funded by the Robert Wood Johnson Foundation, the researchers collected data through staff interviews, staff observations, patient focus groups, and staff surveys to create the index with a questionnaire and scoring guide. It’s divided into four IT-based categories: Scale and scope of use; quality; human capital, policy and resources; and community infrastructure.


History Lesson

“The early-2000s SARS epidemic is a good illustrator of information as one of the most critical tools in addressing any type of public health crisis,” says Agarwal. “[Cases] spread like wildfire, with more than 8,000 people becoming infected globally over a three-year time frame. That number may have been substantially lower if information about new cases had been monitored and shared to get an accurate picture of the prevalence and spread of the disease.”

More recently, in a Texas Ebola case, a public health worker had access to a hospital’s electronic health records containing information about the patient’s travels that could have prompted immediate action, Agarwal says. “But no one paid attention to it.”

Both cases, collectively, show the importance of “seamless data integration across acute care (hospitals), primary care (clinics and other ambulatory facilities), and public health delivery locations,” she says. “And of course, all of this has to occur while simultaneously maintaining the privacy of pertinent patient data.”

In positive trends, Agarwal says “syndromic surveillance” has been a core aspect of the Meaningful Use standards enforced by the Centers for Medicare and Medicaid Services, and electronic health records adoption has risen by as much as 55 percent, according to estimates. “But resource-strapped departments remain unable to utilize the data effectively,” Agarwal says. “We have a long way to go.”

Read More

The Public Health Information Technology Maturity Index: An Approach to Evaluating the Adoption and Use of Public Health Information Technology by Kenyon Crowley, UMD’s Robert H. Smith School of Business; Robert S. Gold, UMD School of Public Health; Sruthi Bandi, UMD’s iSchool; and Ritu Agarwal, UMD’s Robert H. Smith School of Business, appears in the April 2016 issue of Frontiers in Public Health Services and Systems Research.

Forthcoming Conference

CHIDS will host its annual Workshop on Health IT and Economics on Oct. 21-22, 2016, at the Westin Georgetown in Washington, D.C.  The event is designed to deepen the understanding of health IT design and its resultant impact and to stimulate new ideas with both policy and business implications.

Newly Discovered 'Blue Whirl' Fire Tornado Burns Cleaner for Reduced Emissions

August 4, 2016

Melissa L. Andreychek  301-405-0292
Lee Tune 301-405-4679

Findings could lead to cleaner oil spill cleanups

COLLEGE PARK, Md. — Fire tornados, or ‘fire whirls,’ pose a powerful and essentially uncontrollable threat to life, property, and the surrounding environment in large urban and wildland fires. But now, a team of researchers in the University of Maryland’s A. James Clark School of Engineering say their discovery of a type of fire tornado they call a ‘blue whirl’ could lead to beneficial new approaches for reducing carbon emissions and improving oil spill cleanup.

A new paper published online August 4, 2016, in the peer-reviewed journal Proceedings of the National Academy of Sciences (PNAS) describes this previously unobserved flame phenomenon, which burns nearly soot-free.

“Blue whirls evolve from traditional yellow fire whirls. The yellow color is due to radiating soot particles, which form when there is not enough oxygen to burn the fuel completely,” said Elaine Oran, Glenn L. Martin Institute Professor of Engineering and co-author of the paper. “Blue in the whirl indicates there is enough oxygen for complete combustion, which means less or no soot, and is therefore a cleaner burn.”

The Clark School team initially set out to investigate the combustion and burning dynamics of fire whirls on water. What they discovered was a novel, swirling blue flame that they say could help meet the growing worldwide demand for high-efficiency, low-emission combustion.

“A fire tornado has long been seen as this incredibly scary, destructive thing. But, like electricity, can you harness it for good? If we can understand it, then maybe we can control and use it,” said Michael Gollner, assistant professor of fire protection engineering and co-author of the paper.

“This is the first time fire whirls have been studied for their practical applications,” Gollner added.

Some oil spill remediation techniques include corralling the crude oil to create a thick layer on the water surface that can be burned in place, but the resulting combustion is smoky, inefficient, and incomplete. However, the Clark School researchers say blue whirls could improve remediation-by-combustion approaches by burning the oil layer with increased efficiency, reducing harmful emissions into the atmosphere around it and the ocean beneath it.

“Fire whirls are more efficient than other forms of combustion because they produce drastically increased heating to the surface of fuels, allowing them to burn faster and more completely. In our experiments over water, we’ve seen how the circulation fire whirls generate also helps to pull in fuels. If we can achieve a state akin to the blue whirl at larger scale, we can further reduce airborne emissions for a much cleaner means of spill cleanup,” explained Gollner.

Beyond improvements to fuel efficiency and oil spill remediation, there are currently few easy methods to generate a stable vortex in the lab, so the team hopes their discovery of the ‘blue whirl’ can serve as a natural research platform for the future study of vortices and vortex breakdown in fluid mechanics.

“A fire whirl is usually turbulent, but this blue whirl is very quiet and stable without visible or audible signs of turbulence,” said Huahua Xiao, assistant research scientist in the Clark School's Department of Aerospace Engineering and corresponding author of the paper. “It’s really a very exciting discovery that offers important possibilities both within and outside of the research lab.”

The paper, “From fire whirls to blue whirls and combustion with reduced pollution,” Huahua Xiao, Michael J. Gollner, and Elaine S. Oran, was published August 4, 2016, in the journal Proceedings of the National Academy of Sciences (PNAS).

To access blue whirl photos and videos, visit: http://go.umd.edu/bluewhirl

This work was supported by the National Science Foundation through an EAGER award CBET-1507623 and by the University of Maryland through Minta Martin Endowment Funds in the Department of Aerospace Engineering and the Glenn L. Martin Institute Chaired Professorship at the A. James Clark School of Engineering.


New Quantum Computer Module Sets Stage for General-Purpose Quantum Computers

August 4, 2016

Chris Cesare 301-405-0824
Lee Tune 301-405-4679

COLLEGE PARK, Md. – A team of researchers, led by University of Maryland Physics Professor Christopher Monroe, has introduced the first fully programmable and reconfigurable quantum computer module in a paper published as the cover article in the August 4 issue of the journal Nature. The new finding represents a leap in the field of quantum computers according to experts, and already is drawing significant scientific attention.

The new device, dubbed a module because of its potential to connect with copies of itself, takes advantage of the unique properties offered by trapped ions to run any algorithm—a computer program dedicated to solving a particular problem—on five quantum bits, or qubits—the fundamental unit of information in a quantum computer. Quantum computers promise speedy solutions to some difficult problems, but building large-scale, general-purpose quantum devices is a problem fraught with technical challenges.

To date, many research groups have created small, but functional, quantum computers. By combining a handful of atoms, electrons or superconducting junctions, researchers now regularly demonstrate quantum effects and run simple quantum algorithms.

But these laboratory devices are often hard-wired to run one program or limited to fixed patterns of interactions between their quantum constituents. Making a quantum computer that can run arbitrary algorithms requires the right kind of physical system and a suite of programming tools. Atomic ions (charged atoms), confined by fields from nearby electrodes, are among the most promising platforms for meeting these needs.

“For any computer to be useful, the user should not be required to know what’s inside,” said Monroe, who is also a UMD Distinguished University Professor, the Bice Zorn Professor of Physics, and a fellow of the Joint Quantum Institute and the Joint Center for Quantum Information and Computer Science. “Very few people care what their iPhone is actually doing at the physical level. Our experiment brings high-quality quantum bits up to a higher level of functionality by allowing them to be programmed and reconfigured in software.”

This photograph of an ion trap is by Shantanu Debnath and Emily Edwards of the University of Maryland and the Joint Quantum Institute.

The new module builds on decades of research into trapping and controlling ions. It uses standard techniques but also introduces novel methods for control and measurement. This includes manipulating many ions at once using an array of tightly-focused laser beams, as well as dedicated detection channels that watch for the glow of each ion.

“These are the kinds of discoveries that the NSF Physics Frontiers Centers program is intended to enable,” said Jean Cottam Allen, a program director in the National Science Foundation’s physics division. “This work is at the frontier of quantum computing, and it’s helping to lay a foundation and bring practical quantum computing closer to being a reality.”

The Joint Quantum Institute is a research partnership between University of Maryland (UMD) and the National Institute of Standards and Technology, with the support and participation of the Laboratory for Physical Sciences.

The team tested their module on small instances of three problems that quantum computers are known to solve quickly. Having the flexibility to test the module on a variety of problems is a major step forward, said the paper’s lead author Shantanu Debnath, a graduate student at UMD and the Joint Quantum Institute. “By directly connecting any pair of qubits, we can reconfigure the system to implement any algorithm,” Debnath said. “While it’s just five qubits, we know how to apply the same technique to much larger collections.”

At the module’s heart, though, is something that’s not even quantum: A database stores the best shapes for the laser pulses that drive quantum logic gates, the building blocks of quantum algorithms. Those shapes are calculated ahead of time using a regular computer, and the module uses software to translate an algorithm into the pulses in the database.

Putting the pieces together

Every quantum algorithm consists of three basic ingredients. First, the qubits are prepared in a particular state; second, they undergo a sequence of quantum logic gates; and last, a quantum measurement extracts the algorithm’s output.

The module performs these tasks using different colors of laser light. One color prepares the ions using a technique called optical pumping, in which each qubit is illuminated until it sits in the proper quantum energy state. The same laser helps read out the quantum state of each atomic ion at the end of the process. In between, a separate laser strikes the ions to drive quantum logic gates.

These gates are like the switches and transistors that power ordinary computers. Here, lasers push on the ions and couple their internal qubit information to their motion, allowing any two ions in the module to interact via their strong electrical repulsion. Two ions from across the chain notice each other through this electrical interaction, just as raising and releasing one ball in a Newton’s cradle transfers energy to the other side.

The re-configurability of the laser beams is a key advantage, Debnath says. “By reducing an algorithm into a series of laser pulses that push on the appropriate ions, we can reconfigure the wiring between these qubits from the outside,” he said. “It becomes a software problem, and no other quantum computing architecture has this flexibility.”

To test the module, the team ran three different quantum algorithms, including a demonstration of a Quantum Fourier Transform (QFT), which finds how often a given mathematical function repeats. It is a key piece in Shor’s quantum factoring algorithm, which would break some of the most widely-used security standards on the internet if run on a big enough quantum computer.
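The classical discrete Fourier transform illustrates the period-finding idea behind the QFT (the quantum version applies the same transform to the amplitudes of a quantum state). A minimal sketch, with the sequence length and period chosen arbitrarily for illustration:

```python
import numpy as np

# A sequence that repeats every 8 samples: ones at multiples of the period.
N = 64
period = 8
f = np.array([1.0 if k % period == 0 else 0.0 for k in range(N)])

# The magnitude spectrum of a periodic sequence peaks at multiples of
# N/period, which is how a Fourier transform reveals how often the
# sequence repeats.
spectrum = np.abs(np.fft.fft(f))
peaks = np.nonzero(np.isclose(spectrum, spectrum.max()))[0]
print(peaks)  # peaks fall at multiples of N // period = 8
```

Shor’s factoring algorithm exploits exactly this effect: the QFT turns a hidden period in a function’s values into a peak structure that a measurement can reveal.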

Two of the algorithms ran successfully more than 90 percent of the time, while the QFT topped out at a 70 percent success rate. The team says that this is due to residual errors in the pulse-shaped gates as well as systematic errors that accumulate over the course of the computation, neither of which appear fundamentally insurmountable. They note that the QFT algorithm requires all possible two-qubit gates and should be among the most complicated quantum calculations.

The team believes that eventually more qubits—perhaps as many as 100—could be added to their quantum computer module. It is also possible to link separate modules together, either by physically moving the ions or by using photons to carry information between them.

Although the module has only five qubits, its flexibility allows for programming quantum algorithms that have never been run before, Debnath said. The researchers are now looking to run algorithms on a module with more qubits, including the demonstration of quantum error correction routines as part of a project funded by the Intelligence Advanced Research Projects Activity.

UMD Names Dr. Keith Marzullo as Dean of College of Information Studies

August 1, 2016

Crystal Brown 301-405-4621

COLLEGE PARK, Md. – The University of Maryland announces the appointment of Dr. Keith Marzullo as Dean of the College of Information Studies (iSchool). Dr. Marzullo officially begins his position on August 1, 2016.

As Dean, Dr. Marzullo will build upon his extensive background in computer science and cybersecurity to elevate the iSchool’s leadership in information management, libraries and archives, and human-computer interaction.

“Dr. Marzullo is an impressive addition to the University of Maryland family," says Mary Ann Rankin, UMD’s senior vice president and provost. “His deep expertise and innovative vision for the iSchool are a tremendous asset to the College of Information Studies and to the greater University.”

Dr. Marzullo will join the university from the White House Office of Science and Technology Policy, where he served as the Director of the Networking and Information Technology Research and Development (NITRD) Program. Prior to that, Dr. Marzullo was at the National Science Foundation (NSF) for five years, where he served as Director of the Division of Computer and Network Systems (CNS) in the Computer and Information Science and Engineering (CISE) Directorate. In this role, he provided leadership in cybersecurity, networking, computer systems, and cyber physical systems. 

“It is a great honor, privilege, and pleasure to join the ranks of the University of Maryland’s iSchool, and to have the opportunity to build upon the college’s recent successes and increasingly stellar reputation,” says Dr. Marzullo. “I look forward to bringing my technological and administrative aptitudes to a college that is helping to meet the world’s burgeoning needs for library and information science, information management, and human-computer interaction advancements.” 

Dr. Marzullo previously held faculty positions at the University of California, San Diego, the University of Tromsø, Norway, and at Cornell University.

Dr. Marzullo received an A.B. in physics from Occidental College, Los Angeles, and an M.S. in applied physics and a Ph.D. in electrical engineering, both from Stanford University. His dissertation research involved development of the Xerox Research Internet Clock Synchronization protocol, one of the first practical fault-tolerant protocols addressing this problem. His more recent research, prior to his administrative service, focused on issues in the foundations of distributed systems and cybersecurity.

Partnership Between University of Maryland and U.S. Army Research Laboratory Harnesses the Power of Defense Supercomputing to Create Opportunities for Scientific Discovery

July 29, 2016

Katie Lawson, University of Maryland, 301-405-4622
Joyce Martin, U.S. Army Research Laboratory, 301-394-1178

Strategic Alliance Offers Accessible, Enhanced HPC Resources Benefiting Researchers,
Higher Education and National Security 

COLLEGE PARK, Md. – The University of Maryland (UMD) and the U.S. Army Research Laboratory (ARL), the central laboratory that provides world-class research for the Army, today announced a strategic partnership to provide high-performance computing (HPC) resources for use in higher education and research communities. 

As a result of this synergistic partnership, students, professors, engineers and researchers will have unprecedented access to technologies that enable scientific discovery and innovation.

The partnership was formed under ARL’s “Open Campus” initiative, which aims to build a science and technology ecosystem. Mid-Atlantic Crossroads (MAX), a University of Maryland center that operates a multi-state advanced cyberinfrastructure platform, will connect ARL’s high-performance computer “Harold” to this ecosystem on its 100-Gbps optical network. Collaborators from the UMD, MAX and ARL communities will be able to build research networks, explore complex problems, engage in competitive research opportunities and encounter realistic research applications.

“The UMD/MAX-ARL partnership provides a unique opportunity for both organizations to create a national model of collaboration in the HPC field,” said Tripti Sinha, MAX Executive Director and UMD Assistant Vice President and Chief Technology Officer. “Collaborative partnerships are key to maximizing our technological potential and ensuring our nation’s strength and competitiveness in the critical fields of science and research. UMD and MAX are very excited to work with ARL on this endeavor.” 

In addition to increasing accessibility and enhancing HPC resources for researchers, the collaboration between UMD/MAX and ARL will also support innovation activities conducted by private and startup companies that connect through MAX’s infrastructure.  

“Our goal is to take the cutting-edge computational power that we use for defense research, development, test, and evaluation and put that in a place that will benefit the wider scientific community,” said Dr. Raju Namburu, Chief, Computational Sciences Division, Computational and Information Sciences Directorate, U.S. Army Research Laboratory.

UMD, MAX, and ARL’s combined effort not only benefits the mid-Atlantic region, but also aligns with the federal government’s strategic initiative to maximize the benefits of supercomputing for economic competitiveness, scientific discovery and national security. An executive order announced in July 2015 established the National Strategic Computing Initiative (NSCI) to support the United States in its efforts to remain a leader in the development and deployment of HPC systems.

“The university is in full support of the federal government’s leadership on this critical HPC initiative,” said Eric Denna, UMD Vice President and Chief Information Officer. “The creation of the UMD/MAX-ARL partnership is just one step in the promotion of HPC innovation, and UMD will continue to actively participate by contributing technical expertise and sharing knowledge with our key collaborators.”

The UMD/MAX-ARL partnership also lays the foundation for the organizations to expand their reach and make additional HPC resources accessible to the communities they serve.

Harold will become available once the machine is scrubbed, declassified and brought into ARL’s demilitarized zone, or perimeter network. Under ARL and UMD’s cooperative research and development agreement (CRADA), the HPC resource will be allocated to MAX’s Internet Protocol (IP) address space and will be accessible to the collective communities of UMD, MAX and ARL’s Open Campus. As a result, researchers will have at their fingertips supercomputing-caliber computational capability and a leading-edge platform for advanced networking research, designed for application development and networking experiments. 

“This joint research venture with UMD/MAX will leverage ARL’s high-performance resources and the Army's groundbreaking research programs in emerging scientific computing architectures, such as non-von Neumann computing architectures, distributed ad-hoc computing and programmable networks,” Namburu said. “The result is a unique opportunity for synergistic collaboration between two prominent organizations on the forefront of research and innovation.” 

The ultimate goal is to share HPC resources for the good of the community and ensure that groundbreaking collaborative projects have the necessary tools.

“An HPC resource like Harold will significantly enhance the capabilities of the University of Maryland’s faculty and student researchers,” said Patrick O’Shea, UMD Vice President and Chief Research Officer. “The partnership between UMD/MAX and ARL opens up connections for our community and enables research opportunities. We are eager to see the expansion of our creative ecosystem.”

About University of Maryland (UMD)

The University of Maryland is the state's flagship university and one of the nation's preeminent public research universities. A global leader in research, entrepreneurship and innovation, the university is home to more than 37,000 students, 9,000 faculty and staff, and 250 academic programs. Its faculty includes three Nobel laureates, three Pulitzer Prize winners, 56 members of the national academies, and scores of Fulbright scholars. The institution has a $1.8 billion operating budget and secures $550 million annually in external research funding. For more information about the University of Maryland, visit www.umd.edu.   

About Mid-Atlantic Crossroads (MAX)

Mid-Atlantic Crossroads (MAX) is a center at the University of Maryland that operates a multi-state advanced cyberinfrastructure platform. MAX’s all-optical, Layer 1 core network is the foundation for a high-performance infrastructure providing state-of-the-art 100-Gbps network technology and services. MAX participants include universities, federal research labs, and other research-focused organizations in the Washington and Baltimore metropolitan areas. MAX serves as a connector and traffic aggregator to the Internet2 national backbone and peers with other major networks. Its mission is to provide cutting-edge network connectivity for its participants, tailored and generic data-transport solutions, and advanced services to accommodate and optimize large data flows and to facilitate network and application research. For more information about MAX and MAX services, please visit www.maxgigapop.net.

About U.S. Army Research Laboratory (ARL)

The U.S. Army Research Laboratory is part of the U.S. Army Research, Development and Engineering Command, which has the mission to ensure decisive overmatch for unified land operations to empower the Army, the joint warfighter and our nation. RDECOM is a major subordinate command of the U.S. Army Materiel Command.

UMD Team Discovers Insight into the 'Language' Animals Use to Keep Cells Identical

July 26, 2016

Matthew Wright 301-405-9267
Lee Tune 301-405-4679

Biologists and computer scientists used machine translation software to yield new understanding with potential insights into some cancers and age-related diseases 

COLLEGE PARK, Md. – All animals begin life as a single cell from which arise the many different cell types, such as heart, lung and blood cells, that are specific to that type of animal. However, once cell differentiation has produced many different tissues, each organism faces a new, opposite imperative: keeping new cells in each type of tissue the same as their brethren. Cancers arise in a tissue when a cell becomes different from its neighbors; they thus represent a failure to maintain this critical uniformity.

Biologists have long known a great deal about how cells differentiate into various tissues during development. However, little has been known about how organisms continually maintain a population of identical cells in each tissue over an animal’s entire lifetime. Now, a University of Maryland research team has discovered that a regulatory protein named ERI-1 helps ensure that all cells in a tissue remain identical to one another. 

The work involved collaboration between developmental biologists and computer scientists, with the latter contributing their expertise with machine learning analysis that they typically use for computer language translation. The finding could bring biologists one step closer to understanding some cancers and other age-related diseases.

Roundworms (Caenorhabditis elegans) with a disabled eri-1 gene can lose their ability to control repetitive DNA. In the absence of eri-1, even two age-matched siblings can look dramatically different. These differences arise from variable expression of high-copy DNA (green) but not of low-copy DNA (magenta) in the worms’ intestinal cells. In worms with a functional eri-1 gene, even high-copy DNA is expressed uniformly in all animals. Image credit: Antony Jose

The study, which is the first of its kind conducted in a whole animal (the roundworm Caenorhabditis elegans) rather than in cultured cells, appears in the August 1, 2016 issue of the Journal of Cell Biology. The researchers’ approach reveals one important mechanism that animals use to maintain uniform patterns of gene expression. The team’s use of machine learning software proved essential for quickly and clearly identifying complex patterns in the data.

“Cells can look the same and behave the same, but how? The liver is full of liver cells and doesn’t have any heart cells, for example. There’s so much that needs to happen to maintain a tissue,” said Antony Jose, an assistant professor in the UMD Department of Cell Biology and Molecular Genetics and senior author on the study. “It’s a fundamental question that’s been hiding in plain sight. We’ve now proposed an answer that could help advance our understanding of age-related diseases.”

The results suggest that long sections of repetitive DNA can be read differently from cell to cell. The researchers found that, in healthy tissues, ERI-1 normalizes these differences by ensuring that each cell expresses its genes at the same levels. When the researchers turned off the gene that produces ERI-1 in C. elegans, an abnormal patchwork of gene expression appeared in the worms’ intestines.

“To understand these processes, we needed to measure single-cell differences in a whole animal,” Jose said. “We had to know which cell was related to which others and simultaneously measure various properties in all cells within a tissue. Technically, achieving this is very difficult. But you can’t adequately answer these questions outside the context of the whole animal.”

To achieve this complex analysis, Jose and his colleagues formed an unexpected collaboration. Lead author Hai Le (B.S. ’13, biological science), an undergraduate researcher in Jose’s lab who is now a student at the Johns Hopkins University School of Medicine, presented a poster at UMD’s Bioscience Day conference in 2012. Future collaborator Michael Bloodgood, an associate research scientist at UMD’s Center for Advanced Study of Language, stopped by to discuss Le’s work. The two researchers quickly recognized the potential for machine learning to help facilitate Le and Jose’s analysis. 

“Linguists use machine learning to compare blocks of text to identify nouns and verbs, analyze sentence structures and determine average word length, for example,” Jose said. “Hai and Michael recognized that we could use the same technique to analyze gene expression in intestinal cells.”

Machine learning software can reveal complex patterns that the human eye cannot see. As the name implies, the software can be taught to look for specific patterns and can also “learn” from experience, becoming more efficient with each subsequent analysis. Using this approach enabled the researchers to quickly make objective comparisons that would have been all but impossible using other methods.
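The underlying idea can be illustrated without any biology: train a classifier on labeled feature vectors, then let it assign labels to new samples. The sketch below uses a nearest-centroid classifier in plain Python; the feature encoding (mean expression and cell-to-cell variance per animal), the labels and the numbers are invented for illustration and are not taken from the study:

```python
def train_centroids(samples):
    """samples: {label: [feature vectors]} -> {label: centroid vector}."""
    centroids = {}
    for label, vecs in samples.items():
        dim = len(vecs[0])
        centroids[label] = [sum(v[i] for v in vecs) / len(vecs)
                            for i in range(dim)]
    return centroids

def classify(centroids, vec):
    """Assign vec to the label whose centroid is nearest (squared distance)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist2(centroids[label], vec))

# Hypothetical features: (mean expression, cell-to-cell variance) per animal.
training = {
    "uniform":   [(1.00, 0.01), (1.10, 0.02), (0.95, 0.01)],
    "patchwork": [(1.00, 0.80), (0.90, 0.90), (1.05, 0.70)],
}
model = train_centroids(training)
print(classify(model, (1.02, 0.05)))   # low cell-to-cell variance
print(classify(model, (0.98, 0.75)))   # high cell-to-cell variance
```

The point of the sketch is that once expression measurements are reduced to feature vectors, distinguishing "uniform" from "patchwork" tissues becomes a standard pattern-classification problem, which is exactly the kind of comparison that would be slow and subjective by eye.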

The researchers chose C. elegans because it is a simple organism that can easily be studied at the level of a single cell while it is still alive. Jose notes that their technique is broadly applicable, and could be modified to work with other genes and different tissues as well. If others adopt the team’s whole-animal methodology, Jose believes it could signal a shift in the way cell biologists approach their experimental design.

“The effect of cancer drugs is often examined in cultured cells. Our work suggests that studies on cells outside an animal could miss many things. For example, cultured cells can show differences in gene expression that are eliminated in a whole animal,” Jose said. “I believe our results could lead to some shifts in thinking about how to imitate a whole animal in cell culture conditions.”

In addition to Jose, Le and Bloodgood, authors on the study include Monika Looney (B.S. ’16, biological science, psychology) and Benjamin Strauss (B.S. ’12, computer engineering), both of whom were undergraduate researchers when the study was conducted. 

The research paper, “Tissue homogeneity requires inhibition of unequal gene silencing during development,” Hai Le, Monika Looney, Benjamin Strauss, Michael Bloodgood and Antony Jose, appears in the August 1, 2016 print edition of the Journal of Cell Biology.

This work was supported by the Howard Hughes Medical Institute and the National Institutes of Health (Award Nos. R00GM085200 and R01GM111457). The content of this article does not necessarily reflect the views of these organizations.

