Second Big Data workshop fosters connections across disciplines at Texas A&M University

The second annual Big Data workshop was held recently at Texas A&M University to foster connections across the disciplines that intersect this area and to help researchers continue to identify opportunities for collaboration. The workshop comprised 27 short talks from speakers from across the university, organized into thematic sessions with time for discussion. The sessions encompassed Big Data in Sensing and Social Applications; Environment, Resources and Power; Materials; Cybersecurity; and Bioinformatics, Medicine & Health Sciences. Participants also discussed broader issues for big data research at the university, including infrastructure support, computational resources and the availability of data for collaboration. There were over 90 registered attendees.

Many researchers across Texas A&M have current or emerging research interests in big data methods, systems or applications, and there are currently opportunities at the federal level for major funding of cross-disciplinary projects in data science.

Building on the first workshop held last year, Dr. Nick Duffield, professor in the Department of Electrical and Computer Engineering and director of the Texas A&M Engineering Big Data Initiative, and Dr. Dilma Da Silva, head of the Department of Computer Science and Engineering, organized the workshop to continue building community among big data researchers at Texas A&M.

Since the first workshop, interdisciplinary teams from Texas A&M have submitted proposals for funding opportunities including the NSF BIGDATA and NSF Big Data Spokes programs. In order to help researchers better position themselves for these and other opportunities, the Texas A&M Engineering Experiment Station (TEES), in partnership with the Texas A&M Division of Research, Texas A&M AgriLife Research, Texas A&M Health Science Center (TAMHSC) and the Texas A&M Transportation Institute (TTI), has awarded nearly $350,000 in seed grant funding to seven interdisciplinary research teams for big data.

“Texas A&M is positioned to lead in applications of big data in its disciplinary areas of strength, not only in research, but by leveraging its network of cross-sector partnerships to realize the benefits of big data applications more widely,” said Duffield. Texas A&M will host a conference on “Advances in Big Data Modeling, Computation and Analytics,” on Sept. 22-24, which will feature leading researchers and practitioners in the field.

The 2016 workshop program, including slides for some presentations, can be found at

Lin wins Best Paper Award at prestigious conference

Honghuang Lin, a Ph.D. student in the Department of Electrical and Computer Engineering at Texas A&M University, received a third-place Best Paper Award at the premier conference for the functional design and verification of electronic systems.

Lin, who is in the computer engineering and systems group, is advised by Dr. Peng Li. He received the award at the Design and Verification Conference and Exhibition U.S. (DVCon U.S.) for his paper, “Functional Coverage Collection for Analog Circuits – Enabling Seamless Collaboration Between Design and Verification,” which was co-authored with engineers from Texas Instruments Inc. The award is voted on by all participants of the conference.

Lin’s paper addresses the challenges of analog verification coverage, an increasingly important topic in industry, through a proposed Analog Coverage Checking (ACC) framework. It is the first work to apply functional coverage metrics to analog circuits in a mixed-signal simulation environment, using analog functional checkers deep inside the schematic hierarchy. The framework also provides a mechanism for seamless collaboration between analog designers and verification engineers, unifying the definition of analog functional descriptions at the chip level for analog and mixed-signal designs with the construction of meaningful analog covergroup verification at the top level. The proposed ACC framework thus serves as an efficient tool for embedding analog checking in the early design stage and for conveniently tracking verification progress.
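The paper itself works with SystemVerilog-style covergroups in a mixed-signal simulation flow; purely to illustrate the underlying idea of functional coverage for an analog signal — binning the signal's expected range and checking which bins simulation actually exercises — here is a hypothetical sketch in Python (not the ACC framework itself, and the signal values and bin counts are made up):

```python
# Hypothetical sketch of analog functional coverage: divide a signal's
# expected range into bins (covergroup-style) and record which bins
# the simulation samples actually hit.

def make_bins(lo, hi, n):
    """Split the range [lo, hi] into n equal coverage bins."""
    step = (hi - lo) / n
    return [(lo + i * step, lo + (i + 1) * step) for i in range(n)]

def collect_coverage(samples, bins):
    """Mark each bin hit by at least one sample; return hit flags."""
    hit = [False] * len(bins)
    for v in samples:
        for i, (lo, hi) in enumerate(bins):
            if lo <= v <= hi:
                hit[i] = True
    return hit

# Toy "simulation" samples of an output voltage (volts, invented).
samples = [0.05, 0.42, 0.48, 1.1]      # the 0.6-0.9 V bin is never hit
bins = make_bins(0.0, 1.2, 4)          # four bins of width 0.3 V
hit = collect_coverage(samples, bins)
coverage = 100.0 * sum(hit) / len(hit)
print(f"coverage: {coverage:.0f}%")    # 3 of 4 bins exercised -> 75%
for (lo, hi), h in zip(bins, hit):
    if not h:
        print(f"coverage hole: {lo:.1f}-{hi:.1f} V not exercised")
```

Reporting the unexercised bins is what lets a verification team track progress: a coverage hole points to a region of analog behavior the testbench has not yet stimulated.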

Lin received his bachelor’s degree in automation from Tsinghua University, China, in 2011. His research interests include analog and mixed-signal circuit verification, machine-learning-based circuit analysis and circuit modeling.

DVCon is the premier conference for discussion of the functional design and verification of electronic systems. DVCon is sponsored by Accellera Systems Initiative, an independent, not-for-profit organization dedicated to creating the design and verification standards required by systems, semiconductor, intellectual property (IP) and electronic design automation (EDA) companies. In response to global interest, in addition to DVCon U.S., Accellera also sponsors DVCon Europe and DVCon India. For more information about Accellera, please visit. For more information about DVCon U.S., please visit

Three computer engineering professors named IEEE Fellow

Three professors in the Department of Electrical and Computer Engineering at Texas A&M University were named Fellows of the Institute of Electrical and Electronics Engineers (IEEE). Dr. Jiang Hu, Dr. Peng Li and Dr. Xi Zhang were named IEEE Fellows for their research contributions.

IEEE Fellow is the highest grade of membership and is recognized by the technical community as a prestigious honor and an important career achievement. The IEEE grade of Fellow is conferred by the IEEE board of directors upon a person with an outstanding record of accomplishments in any of the IEEE fields of interest. The total number selected in any one year cannot exceed one-tenth of one percent of the total voting membership.

Hu was elected for contributions to gate, interconnect and clock network optimization in VLSI circuits; Li was elected for contributions to the analysis and modeling of integrated circuits and systems; and Zhang was elected for his contributions to quality of service (QoS) in mobile wireless networks.

Dr. Jiang Hu

Digital VLSI chips, such as microprocessors and video decoders, are mostly composed of logic gates, which are connected by interconnect wires and synchronized by clock networks. Hu’s research accomplishments encompass all three of these key elements. For gate optimization, he is a main contributor to state-of-the-art solutions that address industrial challenges in nanometer VLSI technologies, including competing design objectives, complex models, non-ideal effects and huge problem sizes. Interconnect, meanwhile, is a critical bottleneck to digital chip performance.

On interconnect optimization, Hu has had a large impact in both academia and industry. His research results have been applied in many industrial chip products, yielding better chip performance, lower chip power, shorter design turnaround times and solutions to difficult design cases. Hu is also highly recognized for his research on VLSI clock network optimization. Among many contributions, he pioneered the concept of the cross-link, which greatly enhances clock network robustness with high energy efficiency, and has inspired numerous follow-up research activities. Hu’s overall achievement has been instrumental in shaping the course of VLSI optimization research and in helping the VLSI industry tackle real-world challenges.

Dr. Peng Li

Li obtained his Ph.D. in electrical and computer engineering from Carnegie Mellon University and joined the department in 2004. He has established expertise in electronic design automation, integrated circuits and systems, brain-inspired computing and aspects of computational neuroscience. In addition to his elevation to IEEE Fellow, his work has been recognized by various distinctions, including four best paper awards from prestigious VLSI and EDA conferences, an NSF CAREER Award, and four Inventor Recognition Awards from the Microelectronics Advanced Research Corporation and the Semiconductor Research Corporation. Li also received the Best Paper Hat Trick Award, Prolific Author Award and Top 10 Author in Fifth Decade Award, all from the IEEE/ACM Design Automation Conference, the world’s premier VLSI technology conference.

Li’s former associates have obtained faculty and research positions in academia and industrial labs (Michigan Tech, Cornell Medical College/Cornell University, Intel Strategic CAD Laboratories) and research and development positions in the United States high-tech industry. He has brought his work to the real world through technology transfer and consulting for major semiconductor firms and startups.

Dr. Xi Zhang

Zhang, director of the Networking and Information Systems Laboratory, joined the department in 2002. He received his Ph.D. in electrical engineering and computer science (electrical engineering – systems) from the University of Michigan. He was a research fellow with the School of Electrical Engineering, University of Technology, Sydney, Australia, and the Department of Electrical and Computer Engineering, James Cook University, Australia. He was with the Networks and Distributed Systems Research Department, AT&T Bell Laboratories, Murray Hill, New Jersey, and with AT&T Laboratories Research, Florham Park, New Jersey.

Zhang has published more than 300 research papers, two books and multiple book chapters on mobile wireless networks, statistical delay-bounded QoS guarantee for multimedia wireless networks, 5G mobile wireless networks, wireless cognitive radio networks, wireless sensor networks, underwater wireless networks, network protocol design and modeling, statistical communications, random signal processing, information theory and control theory and systems. His publications have been extensively cited in the research community.

He received the National Science Foundation CAREER Award in 2004 for his research in mobile wireless and multicast networking and systems. He is an IEEE Distinguished Lecturer for the IEEE Communications Society and the IEEE Vehicular Technology Society. He received Best Paper Awards at IEEE GLOBECOM 2014, IEEE GLOBECOM 2009, IEEE GLOBECOM 2007 and IEEE WCNC 2010, and he is the author of an IEEE Best Readings journal paper (receiving the top citation rate). He also received a TEES Select Young Faculty Award for Excellence in Research Performance from the Dwight Look College of Engineering at Texas A&M in 2006.

He serves, or has served, as editor for numerous IEEE transactions and journals, including IEEE Transactions on Communications, IEEE Transactions on Wireless Communications, IEEE Transactions on Vehicular Technology, IEEE Journal on Selected Areas in Communications, IEEE Communications Letters, IEEE Communications Magazine and IEEE Wireless Communications Magazine. He has also served as technical program committee (TPC) chair for IEEE GLOBECOM 2011, TPC vice-chair for IEEE INFOCOM 2010, TPC area chair for IEEE INFOCOM 2012, panel/demo/poster chair for ACM MobiCom 2011 and general vice-chair for IEEE WCNC 2013, among other roles.

Duffield receives DARPA grant for research on network resilience

Dr. Nick Duffield, a professor in the Department of Electrical and Computer Engineering at Texas A&M University, and professor by courtesy in the Department of Computer Science and Engineering, is part of a group that was awarded a multi-million dollar contract from the Defense Advanced Research Projects Agency (DARPA) to help develop new networking and security technologies at the Wide Area Network (WAN) edge.

The awards fall under DARPA’s Edge-Directed Cyber Technologies for Reliable Mission, or Edge-CT, program, which the agency says will combine real-time network analytics, holistic decision systems and dynamically configurable protocol stacks to mitigate WAN failures and attacks on the fly. Its objective is to bolster the resilience of communication over Internet Protocol networks solely by instantiating new capabilities in computing devices within user enclaves at the WAN edge.

The project is led by Applied Communication Sciences with partnership from Apogee Research, the Massachusetts Institute of Technology, the University of Pennsylvania and Texas A&M University, where Duffield is principal investigator. The partners propose to develop Distributed Enclave Defense Using Configurable Edges (DEDUCE). DEDUCE is a new architectural approach to edge-directed network adaptation that incorporates novel approaches to sensing, actuation and control, creating a robust and scalable system that exceeds Edge-CT goals and evolves in response to changes in the network.

Duffield’s involvement in the project stems from his research in network tomography, in which end-to-end performance measurements between network edges are correlated to identify common origins of performance degradation. In DEDUCE, this information will be used to inform strategies for alternate routing on an overlay network between enclaves. Duffield was a co-recipient of the ACM SIGMETRICS Test of Time Award in both 2012 and 2013 for work in network tomography.
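The core idea of network tomography can be shown with a toy example (a simplification for illustration, not DEDUCE itself): when two end-to-end paths share one upstream link and link losses are independent, the shared link's delivery probability can be estimated from end-to-end observations alone as P(A succeeds) · P(B succeeds) / P(both succeed). The topology and probabilities below are invented:

```python
import random

# Toy network-tomography sketch: two measurement paths A and B share
# one upstream link. Estimate that shared link's delivery probability
# purely from correlated end-to-end success observations.

random.seed(1)
p_shared, p_a, p_b = 0.9, 0.95, 0.8   # true per-link delivery probs
N = 200_000                           # number of probe packets

ok_a = ok_b = ok_both = 0
for _ in range(N):
    s = random.random() < p_shared    # shared link delivers?
    a = s and random.random() < p_a   # path A succeeds end to end
    b = s and random.random() < p_b   # path B succeeds end to end
    ok_a += a
    ok_b += b
    ok_both += a and b

# P(A)*P(B)/P(A and B) cancels the private-link terms, leaving the
# shared link's delivery probability.
est = (ok_a / N) * (ok_b / N) / (ok_both / N)
print(f"estimated shared-link delivery prob: {est:.3f}")  # close to 0.9
```

Neither endpoint ever observes the shared link directly; the estimate falls out of the correlation between the two paths' successes, which is the sense in which edge measurements localize a common cause of degradation.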

Duffield received his bachelor’s degree in natural sciences in 1982 and a master’s in 1983 from the University of Cambridge, UK. He received his Ph.D. in mathematical physics from the University of London, U.K., in 1987. His research focuses on data and network science, particularly applications of probability, statistics, algorithms and machine learning to the acquisition, management and analysis of large datasets in communications networks and beyond.

Before joining the department, Duffield worked at AT&T Labs-Research, Florham Park, New Jersey, where he held the position of distinguished member of technical staff and was an AT&T Fellow. He previously held post-doctoral and faculty positions in Dublin, Ireland and Heidelberg, Germany.

Duffield, the author of numerous papers and holder of many patents, is co-inventor of the smart sampling technologies at the heart of AT&T’s scalable Traffic Analysis Service. He is specialty editor-in-chief of the journal Frontiers in ICT, and he was charter chair of the IETF working group on packet sampling. Duffield is an IEEE Fellow and serves on the board of directors of ACM SIGMETRICS. He is an associate member of the Oxford-Man Institute of Quantitative Finance.

Ponniah and Kumar publish monograph on designing secure protocols for wireless ad-hoc networks

Jonathan Ponniah and P. R. Kumar, of the Department of Electrical and Computer Engineering at Texas A&M University, along with co-author Yih-Chun Hu, have published a monograph on designing secure protocols, with provable security guarantees, for wireless ad-hoc networks infiltrated with adversarial nodes. The monograph is titled “A Clean Slate Approach to Secure Wireless Networking.”

The authors note that the current process of designing secure protocols is tantamount to an arms race between attacks and “patches” that does not provide any security guarantees. Motivated by this, they introduce a system theoretic approach to the design of secure protocols with provable security as well as optimality guarantees.

Ponniah is a postdoctoral researcher who completed his Ph.D. under the supervision of Kumar, a university distinguished professor.

CESG Fishbowl Tele-Seminar: Usage-Generated Applications

Last week, Dr. Qiong Wang gave a talk on Usage-Generated Applications and their role in preserving Best-Effort service on the Internet in the presence of Managed Service. The presentation discussed net neutrality, the marketing strategies of Internet service providers (ISPs) and quality of service (QoS) analysis.

Best-Effort service describes a network service in which the user is guaranteed neither that data will always be delivered nor any particular QoS level or priority. The service is nominally unreliable, since delivery is not guaranteed, but it is provided for a low subscription fee with unmetered usage. Thanks to this dynamic, Best-Effort service has contributed to the growth of the Internet and to the creation of many network applications. In the net neutrality debate, a concern has arisen about preserving the Internet as it is when ISPs offer Managed Service while restricting the bandwidth and usage available to Best-Effort users.

Dr. Wang, along with Dr. Debasis Mitra, developed a model to analyze this scenario. The model features a monopoly ISP that offers both Best-Effort service for free and Managed Service, which guarantees QoS for a per-use fee. Customers make optimal choices about whether to subscribe to the network, which service to use and how much to use the chosen service, while the ISP sets fees and allocates bandwidth for both services to maximize profit. The analysis shows the need for Usage-Generated Applications, which stabilize the offering of Best-Effort service, particularly in the presence of ISPs that look to maximize profits through Managed Service.
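The flavor of such a model can be sketched in a deliberately stylized form (our simplification, not the authors' actual model; the uniform valuation distribution and the fixed Best-Effort utility `q_be` are invented for illustration): users with QoS valuation w choose the paid Managed Service when w minus the fee beats the utility of free Best-Effort, and the monopoly ISP picks the fee that maximizes its revenue.

```python
# Stylized monopoly-ISP sketch: free Best-Effort (BE) service with
# fixed utility q_be, versus paid Managed Service (MS). A user with
# QoS valuation w (uniform on [0, 1]) picks MS when w - fee > q_be.

q_be = 0.2                      # assumed utility of free BE service

def ms_share(fee):
    """Fraction of users with w - fee > q_be, for w uniform on [0, 1]."""
    return max(0.0, 1.0 - (fee + q_be))

# The ISP searches a fee grid for the profit-maximizing price.
best_fee = max((f / 1000 for f in range(1001)),
               key=lambda f: f * ms_share(f))
print(f"profit-maximizing fee: {best_fee:.2f}, "
      f"MS share: {ms_share(best_fee):.2f}")
```

Even this toy version shows the tension the talk addressed: the ISP's optimal fee depends on how attractive free Best-Effort remains (`q_be`), so anything that props up Best-Effort quality, such as Usage-Generated Applications, shifts the equilibrium.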

Dr. Qiong Wang received his Ph.D. in engineering and public policy from Carnegie Mellon University. He worked at Alcatel-Lucent Bell Labs as a member of technical staff and is currently an associate professor in the Department of Industrial and Enterprise Systems Engineering at the University of Illinois at Urbana-Champaign. His research focuses on stochastic control of manufacturing and on network economics.

Cyber-Physical Systems: Applications and Challenges

On Tuesday, March 31, a regional meeting of the National Academy of Engineering was held at Texas A&M University’s Annenberg Presidential Conference Center. This general symposium featured speakers who discussed cyber-physical systems, addressing the potential benefits these new systems can have for society, the economy and the environment.

A cyber-physical system (CPS) is a system of computing elements that control physical entities and interact heavily with one another. CPS are often described in terms of embedded systems; however, embedded systems focus more on the computational elements themselves than on the link between those elements and the physical world. A CPS is typically designed as a network of interacting devices rather than as standalone devices, and is closely tied to robotics and sensing. CPS can be found nearly everywhere, including in medicine, automobiles, power grids, city infrastructure, manufacturing, aircraft and building systems, and they offer increased adaptability, autonomy, efficiency, functionality, reliability, safety and usability. The symposium focused primarily on these systems and their integration of computing, communication and control technologies.

Speakers featured at the symposium included Dr. P. R. Kumar, a professor in the Department of Electrical and Computer Engineering at Texas A&M; Dr. John Stankovic, BP America Professor in the Department of Computer Science at the University of Virginia; Dr. Vijay Kumar, UPS Foundation Professor at the University of Pennsylvania, working in mechanical engineering and applied mechanics, computer and information science, and electrical and systems engineering; and David Corman, a National Science Foundation program director in the Division of Computer and Network Systems.

CESG Seminar: Genomic analysis tools for familial and case-control sequencing studies

Earlier this week, on Tuesday, April 7, Dr. Chad Huff gave a talk on genomic analysis tools. He explained how academic efforts are becoming more focused on data analysis and interpretation as genomic data becomes commoditized. Dr. Huff introduced tools his research group has developed to analyze high-throughput sequencing data for bioinformatics.

Traditional genetic analysis tools often perform poorly in large studies because of problems with low power and scalability. The topics Dr. Huff discussed included relationship estimation, pedigree reconstruction, functional variant prediction and the analysis of rare variants. He described a method for detecting genetic relationships called Estimation of Recent Shared Ancestry, which can identify relatives as distant as fourth cousins, aiding the reconstruction of extended pedigrees. Another tool Dr. Huff presented is the Variant Annotation, Analysis, and Search Tool, a probabilistic disease-gene finder that combines amino acid substitution and allele frequency information; this tool has since been extended to find genetic diseases in pedigrees.
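A quick calculation shows why detecting relatives as distant as fourth cousins is hard. By a standard population-genetics result (an illustration of the problem setting, not Dr. Huff's specific method), the expected fraction of the genome that k-th cousins share identical-by-descent (IBD) is 2^-(2k+1), so the signal shrinks fourfold with each degree:

```python
# Expected genome-wide IBD sharing for k-th cousins: 2^-(2k+1).
# First cousins share ~12.5%; by fourth cousins the expectation is
# under 0.2%, which is why naive relationship estimators lose power.

expected = {}
for k, label in enumerate(["1st", "2nd", "3rd", "4th"], start=1):
    frac = 2 ** -(2 * k + 1)          # halves twice per extra degree
    expected[label] = frac
    print(f"{label} cousins: expected IBD fraction = {frac:.4%}")
```

Methods like Estimation of Recent Shared Ancestry recover power for such distant relationships by modeling not just the total amount of sharing but the number and lengths of shared segments.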

Dr. Chad Huff is an assistant professor at the MD Anderson Cancer Center. His lab’s research focuses on human evolution and disease through statistical, computational and population genomics. His group is currently developing new methods to analyze genomic data and applying them to identify the genetic basis of human diseases, cancer in particular.

Texas A&M Workshop on Software Defined Networks

Last week, a workshop on Software Defined Networking was hosted at Texas A&M in the Emerging Technologies Building. It was organized by Dr. Alex Sprintson, a professor in the Texas A&M Department of Electrical and Computer Engineering, and Jasson Casey, a Texas A&M Ph.D. candidate and founder and executive director of

This workshop consisted of many talks encompassing several aspects of software defined networking (SDN), including SDN security, architecture, data planes, abstractions and research. The talks were primarily presented by members of an organization focused on the OpenFlow protocol standard for SDN: a coalition of researchers and industry engineers looking to widen the influence SDN has on industry. Its members aim to implement the OpenFlow protocol with a secure message layer, called the SDN stack. As a research organization, it also seeks to improve key aspects of SDN implementation and to bridge the gap between industry and academia.

The workshop also included a talk by Chip Howes, an industry veteran, about Internet startups. The talk focused on the startup experience, how startups rise and fall, and what it takes to make a startup successful. Howes has created and sold six successful startup companies and has held prominent positions at a wide range of industry leaders over more than 30 years.

Special CESG Seminar – Maple: Simplifying SDN Programming Using Algorithmic Policies

Last week, Dr. Andreas Voellmy gave a talk about software defined networking (SDN). Dr. Voellmy gave an overview of SDN and presented several challenges with OpenFlow, a standard for SDN. He also presented Maple, which addresses these challenges.

A recent development in networking, Software Defined Networking allows a network to change its behavior through a central policy administered by a network controller. Whereas network architecture previously consisted of fixed, closed, vertically integrated network appliances, SDN takes a more general packet-processing approach, programmed through open control software executed on servers. This makes the implementation open and very flexible. One standard for SDN is OpenFlow, which defines rules and guidelines for how SDN should be implemented; many aspects of that implementation remain challenging, however.

To address these challenges, Dr. Voellmy presented Maple. Maple lets the user create algorithmic policies: algorithms written in a general-purpose language and conceptually run on every packet that enters the network. These algorithmic policies free the SDN programmer from generating and maintaining sets of rules on individual network switches. To implement the policies efficiently, Maple uses a tracing runtime system that discovers reusable forwarding decisions from the control program.
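An algorithmic policy is easiest to see as ordinary code. The toy sketch below is in Python for illustration (Maple itself targets general-purpose languages such as Java, and the packet fields and port numbers here are invented), but it captures the programming model: one plain function decides each packet's fate, with no hand-written switch rule tables.

```python
# Toy "algorithmic policy": a plain function that the controller
# conceptually runs on every packet, instead of the programmer
# generating and maintaining per-switch rule sets by hand.

def policy(pkt):
    """Return a forwarding decision for one packet (dict of fields)."""
    if pkt["tcp_dst"] == 22:                    # block SSH traffic
        return "drop"
    if pkt["eth_dst"] == "ff:ff:ff:ff:ff:ff":   # broadcast frames
        return "flood"
    return 1                                    # default output port

# A tracing runtime would observe which fields each call actually
# read (tcp_dst, then perhaps eth_dst) and compile those reads into
# reusable flow-table rules, so most packets never reach the controller.
for pkt in [{"tcp_dst": 22, "eth_dst": "aa:bb:cc:dd:ee:ff"},
            {"tcp_dst": 80, "eth_dst": "ff:ff:ff:ff:ff:ff"}]:
    print(policy(pkt))
```

The comment about tracing is the key design point from the talk: because the runtime records the policy's decision tree as it executes, equivalent future packets can be matched by rules installed on the switches rather than by re-running the function.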

Dr. Andreas Voellmy received his Ph.D. in computer science from Yale University. His research focuses on Software Defined Networking, drawing mainly on OpenFlow library implementation and the Glasgow Haskell Compiler.