Guest speakers from the company Areté
On Thursday, July 1, 2021, EECS Professor James Plank will be hosting two guest speakers from the company Areté (www.arete.com):
Dr. Timothy Klein, Physicist
Mr. Travis Shrontz, Mathematician
“Areté is an advanced science and engineering company that provides innovative solutions — from scientific discovery through production. Areté’s smart systems include active and passive sensors, real-time processing, software, and complex algorithms that operate from seafloor to space.”
Klein and Shrontz will give two talks in Min H. Kao Room MK435, followed by talks from ADVANCE Associate Professor Nicole McFarlane, Assistant Professor Ahmed Aziz, and Professor James Plank on aspects of their research.
- 1:30-2:00pm: Areté: The ALIEN Network
- 2:00-2:30pm: Areté: FogViewer
- 2:30-3:15pm: ADVANCE Associate Professor Nicole McFarlane: Bio-Sensor Design
- 3:30-4:15pm: Assistant Professor Ahmed Aziz: AI and Neuromorphic Hardware
- 4:15-5:00pm: Professor James Plank: TENNLab Research on Neuromorphic Computing Systems
Abstracts of Dr. Timothy Klein’s and Mr. Travis Shrontz’s talks: The ALIEN Network and FogViewer
The ALIEN Network
Over the past several years, many advances have been made in applying deep learning via neural networks to image-processing problems, specifically problems involving target detection, localization, and classification. Leveraging this work has shown some success in reducing the amount of work an analyst must do, which ultimately makes High Altitude-Wide Area Motion Imaging (HA-WAMI) systems more efficient, accurate, and safe. However, no published deep learning algorithm to date is able to efficiently and simultaneously detect and localize targets while also extracting features of each target (such as orientation) within a matter of seconds. To address this, Areté has designed an efficient and robust deep learning algorithm based on classical detection theory: the Attribute Localization and Instance Extraction Neural (ALIEN) Network. This network is able to detect, localize, and characterize a large and variable number of targets in a scene within seconds, with a high degree of accuracy. When we pair this network with the novel use of masking functions to remove non-target inferences from the overall network loss, we obtain a deep-learning network that performs each of the three operations of (1) detect, (2) localize, and (3) characterize in a single forward pass. In this presentation we describe the basic architecture of this network and summarize its performance when applied to publicly available datasets such as Cars Overhead with Context (COWC).
The ALIEN network is designed to detect, localize, and characterize objects within scenes that contain a large number of relatively small-sized targets. By contrast, the object detection methods within the current state of the art of image processing were designed primarily to detect a small number of relatively large-sized targets. The ALIEN network addresses the challenge of target detection in a small-object domain in a novel way. Overall, it can be classed as a grid-based regression algorithm that uses (x, y) pixel estimations. What chiefly sets ALIEN apart from the current state of the art is its use of masking functions, which facilitate multiple and simultaneous inferences because only those inferences which correspond to true targets are included in the overall network loss.

The ALIEN network operates on image-cells of a preassigned pixel dimension. Within each cell, a fixed number of anchor-points are arrayed, representing the maximum number of targets possible within a cell. Inferences are made at each anchor-point, which, in effect, transforms the problem from one where the total number of possible targets is variable to one where the total number of possible targets is fixed. However, the first inference at each anchor-point is the probability that there is a target present in the vicinity of that anchor-point. In this way, only the inferences that correspond to anchor-points where a target is present in the truth-data are included in the total loss and the final output.
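The masking idea above can be illustrated with a minimal sketch: anchor-point inferences are computed everywhere, but a 0/1 mask derived from the truth data zeroes out the loss contribution of anchors with no nearby target. This is not Areté’s implementation; the function name, the squared-error attribute loss, and the toy numbers are all hypothetical, chosen only to show the masking mechanism.

```python
import numpy as np

def masked_anchor_loss(pred, truth, target_mask):
    """Illustrative masked loss: only anchor-points whose mask entry is 1
    (a true target is nearby, per the truth data) contribute to the loss.

    pred, truth:  (num_anchors, num_attributes) arrays of per-anchor inferences
    target_mask:  (num_anchors,) array of 0/1 flags from the truth data
    """
    per_anchor = np.sum((pred - truth) ** 2, axis=1)  # squared error per anchor
    return float(np.sum(per_anchor * target_mask))    # mask removes non-target anchors

# Hypothetical example: 4 anchor-points, 3 attributes each (e.g., x, y, orientation).
pred = np.array([[1.0, 2.0, 0.5],
                 [9.0, 9.0, 9.0],   # spurious inference at an empty anchor
                 [3.0, 4.0, 0.1],
                 [0.0, 0.0, 0.0]])
truth = np.array([[1.0, 2.5, 0.5],
                  [0.0, 0.0, 0.0],
                  [3.0, 4.0, 0.3],
                  [0.0, 0.0, 0.0]])
mask = np.array([1, 0, 1, 0])       # targets present only at anchors 0 and 2

loss = masked_anchor_loss(pred, truth, mask)  # the wild anchor-1 inference costs nothing
```

Note how the large error at anchor 1 is excluded entirely: without the mask, that spurious inference would dominate the loss and penalize the network for output at anchors where no target exists.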
FogViewer
Areté’s FogViewer is an image-based meteorological visibility measurement system that uses low-cost COTS hardware and innovative, advanced image processing and fusion techniques to provide port operators and shipboard personnel with a clear picture of their local environment. FogViewer is small, light, and consumes little power. Built with hardware regularly used in marine environments, FogViewer withstands harsh conditions with low maintenance and operating costs.
FogViewer has been matured into a TRL 5 prototype multi-modal EO/IR system with jointly optimized sensing and processing capabilities for detecting and characterizing visibility conditions. The system achieves visibility prediction across varied conditions and has been demonstrated in real-world, low-visibility scenarios. The FogViewer system has been tested and integrated with NOAA communications equipment, supports automated setup of new systems, has full algorithm integration on Jetson hardware, includes a weather-resistant enclosure for operational deployment, and has undergone multiple extended, unsupervised test deployments.
Abstract of Professor James Plank’s talk:
TENNLab is a collaboration of researchers at the University of Tennessee, Oak Ridge National Laboratory, and SUNY Polytech who focus on neuromorphic computing at all levels of the computing stack, including devices, circuits, architectures, systems, and algorithms. In this talk, Jim Plank will summarize the work of the TENNLab PIs over the past few years, focusing primarily on the efforts in software and algorithms. Highlights include:
- Fabrication of neuromorphic chips with hafnium dioxide memristors at the SUNY foundry.
- Circuit and system design of memristor-based neuromorphic processors.
- A software framework supporting simultaneous research on neuromorphic processors, applications and learning techniques.
- Real-time neuromorphic control applications.
- The effectiveness of various value-to-spike encoding techniques.
- Trading space for time in converting deep convolutional neural networks to spiking neuromorphic systems.
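As background for the encoding-techniques highlight above: one of the simplest value-to-spike encodings is rate coding, in which a real value in [0, 1] becomes a spike train whose average firing rate approximates the value. The sketch below is a generic illustration of that idea, not TENNLab's specific technique; the function name and parameters are hypothetical.

```python
import numpy as np

def rate_encode(value, num_steps, rng):
    """Rate coding: emit a 0/1 spike train of length num_steps in which
    each timestep fires independently with probability `value`, so the
    mean firing rate approximates the encoded value."""
    return (rng.random(num_steps) < value).astype(int)

rng = np.random.default_rng(seed=0)
train = rate_encode(0.8, 1000, rng)  # encode the value 0.8 over 1000 timesteps
rate = train.mean()                  # decoded estimate: close to 0.8
```

Longer spike trains decode the value more precisely, which is one reason encoding choice matters: it trades timesteps (latency) against representational accuracy.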
Thursday, July 1 at 1:30pm to 5:00pm
Min H. Kao Electrical Engineering and Computer Science, 435
1520 Middle Drive, Knoxville, TN 37996