During my work with the Celadon Research Division of Ellumen Inc., I was a co-inventor on a patent titled “Microwave Imaging Device” that recently issued on January 16, 2018. This is the fourth patent on which I have been a co-inventor. If you are looking for more details on my prior three patents, see the post titled “Description of Three Patents Named Co-Inventor On Assigned to Ellumen Inc.” All of these patents were granted by the United States Patent and Trademark Office (USPTO) and are currently assigned to Ellumen Inc. In this post I wanted to describe the “Microwave Imaging Device” patent in more detail.

The “Microwave Imaging Device” patent resulted from wanting an automated way to acquire microwave imaging data for an object and/or body part using movable transmitting and receiving antennas. In addition, there was a desire to collect not just 2D data but also 3D data, and to acquire surface information about whatever was placed inside the scanner. To accomplish this, a system was built that: 1) contained an object support to hold an object, 2) contained a transmitter antenna, 3) contained a receiver antenna, 4) had both an inner and an outer ring, each carrying either the transmitter or the receiver antenna, 5) contained a controller to independently rotate the inner and outer rings, 6) contained a computation processor to receive the collected data, and, in one embodiment, 7) contained a controller to move the object support up and down, and 8) contained an object surface position sensor mounted to either the inner or outer ring to capture the surface of the object. It is important to note that the inner and outer rings are concentric but have different radii. In some embodiments, gears, pinions, and motors are used to rotate the inner and outer rings, while a feedback monitor can detect any mismatch in positioning. The system further allows the object surface position data to be used as a seed in the reconstruction of an image represented in dielectric values. In one embodiment, stored data from a prior image reconstruction that closely matches data of the object is used in combination with the surface position data as a seed in the reconstruction. The patent also allows the transmitter and receiver antennas to be mounted so that they can translate radially toward and away from the center of the device. In addition, the patent covers some aspects of the controller and its modules, including the positions to which the transmitter and receiver antennas are moved, the names and storage locations of the collected data, any necessary instrument parameters, and a calibration of the initial positions of the transmitter and receiver antennas.
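To give a sense of how such an acquisition might be sequenced, below is a minimal Python sketch of the kind of scan loop described above. It is illustrative only: the helper functions (rotate_ring, measure_s21, read_surface_sensor) are hypothetical placeholders for the device's hardware-control layer, and assigning the transmitter to the outer ring and the receiver to the inner ring is simply an assumption for the example.

```python
import itertools

# Hypothetical placeholders for the device's hardware-control layer; the
# names and behavior are illustrative only, not taken from the actual system.
def rotate_ring(ring, angle_deg):
    pass  # would command the motor controller for the named ring

def measure_s21():
    return 0j  # would trigger the instrument and return a scattered-field sample

def read_surface_sensor():
    return (0.0, 0.0, 0.0)  # would return a surface point from the position sensor

def acquire_scan(tx_angles, rx_angles, capture_surface=True):
    """Step the transmitter ring and receiver ring through their angular
    positions and record a measurement at each transmitter/receiver pair."""
    measurements = []
    surface_points = []
    for tx_angle, rx_angle in itertools.product(tx_angles, rx_angles):
        rotate_ring("outer", tx_angle)   # position the transmitter antenna
        rotate_ring("inner", rx_angle)   # position the receiver antenna independently
        measurements.append({"tx": tx_angle, "rx": rx_angle, "data": measure_s21()})
        if capture_surface:
            surface_points.append(read_surface_sensor())
    return measurements, surface_points

# Example: a coarse scan using 10-degree steps for both rings.
data, surface = acquire_scan(range(0, 360, 10), range(0, 360, 10))
```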

The Celadon Research Division of Ellumen Inc. built a prototype of the robotic microwave imaging device described in the patent. The prototype communicates with laboratory instruments (an arbitrary waveform generator, an oscilloscope, and a vector network analyzer) and an infrared sensor, and acquires data at different positions of the transmitting antenna, receiving antenna, and sensor. I helped program instrument commands using the Virtual Instrument Software Architecture (VISA) to talk to the laboratory instruments and automatically acquire data. I collaborated on development of the graphical user interface (GUI) using VB.NET, MATLAB, and a dynamic-link library (DLL). The device can collect data in both the time and frequency domains and can be operated remotely, with monitoring by a camera. I helped collect data and wrote code to process it, including quickly loading many data sets, plotting the data, performing analysis, and performing surface reconstruction. I also helped program and generate image reconstruction results from the data collected by the device. The Celadon Research Division of Ellumen Inc. presented a discussion of the device and imaging results in the journal IEEE Transactions on Microwave Theory and Techniques and at the IEEE AP-S Symposium on Antennas and Propagation and USNC-URSI Radio Science Meeting in San Diego, CA, in July 2017. See the paper titled “A Phase Confocal Method for Near-Field Microwave Imaging” and the paper for the poster presentation titled “Experimental Microwave Near-field Detection with Moveable Antennas” for some additional details. I was a co-author on the published paper and participated in the presentation. A few photos from the conference in San Diego were previously published in the post titled “IEEE AP-S Symposium on Antennas and Propagation and USNC-URSI Radio Science Meeting in San Diego, CA, in July 2017.”
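For readers unfamiliar with VISA, here is a minimal sketch of how this kind of instrument communication works, written with Python and PyVISA for brevity (the actual prototype software used VB.NET, MATLAB, and a DLL, not Python). The resource address and comments about commands are generic examples and would differ for the real instruments.

```python
import pyvisa

# Open a session to an instrument over VISA. The GPIB address below is only
# an example; a real setup would use the address of the actual instrument.
rm = pyvisa.ResourceManager()
vna = rm.open_resource("GPIB0::16::INSTR")
vna.timeout = 10000  # milliseconds

# *IDN? is the standard SCPI identification query supported by most instruments.
print(vna.query("*IDN?"))

# Instrument-specific commands (configuring sweeps, triggering measurements,
# reading traces) would follow here and differ between the arbitrary waveform
# generator, oscilloscope, and vector network analyzer.
vna.close()
rm.close()
```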

It is exciting to work on new technology and devices that can have a real impact on the health of patients. Below is a patent certificate that was created to celebrate the accomplishment of having the patent granted.

Microwave Imaging Device Patent

During my work with the Celadon Research Division of Ellumen Inc., three patents on which I was a co-inventor have issued to date. The first patent was issued in August 2015, titled “Dielectric Encoding of Medical Images.” The second patent was issued in July 2016, titled “Distributed Microwave Image Processing System and Method.” The third patent was issued in July 2017, also titled “Dielectric Encoding of Medical Images.” In addition, a fourth patent titled “Microwave Imaging Device,” on which I am also a co-inventor, is expected to issue later this month (January 2018). All of these patents were granted by the United States Patent and Trademark Office (USPTO) and are currently assigned to Ellumen Inc. I wanted to provide a brief discussion of the first three issued patents.

The first and third patents, both titled “Dielectric Encoding of Medical Images,” resulted from wanting a way for doctors to easily read and understand images produced using electromagnetics and represented in dielectric values. To accomplish this, I worked with the chief executive officer (CEO) of Ellumen Inc. to explore the microwave imaging modality while also allowing for easy adoption by doctors and hospitals. I researched the modality, developed algorithms, and wrote programs to convert medical images in dielectric values to Hounsfield units, which are used in computed tomography (CT) scans, and to MRI intensity values, which are used in magnetic resonance imaging (MRI) scans. The code worked successfully both at single frequencies and over a range of frequencies (using a Debye model). This allows doctors to understand images produced using electromagnetics in readily understood CT and/or MRI formats without requiring any additional training, leading to timely and accurate medical diagnosis. The conversion method developed allows existing medical diagnostic tools and analysis techniques to be used directly with microwave imaging. In addition, the conversion between Hounsfield units and dielectric values works in both directions. Furthermore, the method for converting an image in dielectric values to MRI intensity values includes creating a water content map and a T1 map as an intermediate step. The patent also included a method to convert medical images in Hounsfield units to dielectric values using a frequency-dependent model. Deriving dielectric models from CT scans is often useful when solving complex problems in computational electromagnetics.
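The exact conversion mappings are described in the patents themselves and are not reproduced here, but as a rough illustration of the frequency-dependent piece, below is a minimal Python sketch of a single-pole Debye model for complex relative permittivity, the kind of model the paragraph above refers to. The tissue parameters shown are made-up example values, not values from the patents.

```python
import numpy as np

EPS0 = 8.8541878128e-12  # permittivity of free space (F/m)

def debye_permittivity(freq_hz, eps_inf, eps_s, tau_s, sigma_s):
    """Single-pole Debye model: complex relative permittivity versus frequency.
    eps_inf: high-frequency permittivity, eps_s: static permittivity,
    tau_s: relaxation time (s), sigma_s: static conductivity (S/m)."""
    omega = 2.0 * np.pi * np.asarray(freq_hz, dtype=float)
    return (eps_inf
            + (eps_s - eps_inf) / (1.0 + 1j * omega * tau_s)
            - 1j * sigma_s / (omega * EPS0))

# Made-up example parameters (not from the patents): evaluate a hypothetical
# tissue from 1 GHz to 10 GHz and print the real (dielectric constant) part.
freqs = np.linspace(1e9, 10e9, 10)
eps = debye_permittivity(freqs, eps_inf=4.0, eps_s=40.0, tau_s=7e-12, sigma_s=0.5)
print(eps.real)
```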

The second patent, titled “Distributed Microwave Image Processing System and Method,” resulted from the desire for all imaging centers, radiology groups, and/or doctor’s offices to have access to images produced using electromagnetics without having to upgrade their computer hardware. A method was developed that allows the majority of image processing and image reconstruction of microwave images to occur in a centralized computing environment. Instead of performing image processing and image reconstruction at the imaging centers, radiology groups, and/or doctor’s offices, these remote sites send the microwave data they collect to the centralized computing environment. The centralized computing environment also offers another distinct advantage: the data and results acquired at all the remote sites can be stored and used to enhance processing and reconstruction of microwave images. The centralized computing environment takes advantage of multiple processors to perform iterative reconstruction and seeds the reconstruction using prior data. In one embodiment of the invention, the seed is generated by first comparing collected and stored scattering fields to find a best or closest match and then using stored data of the prior reconstructed image corresponding to the stored scattering fields of that match. In another embodiment of the invention, the seed is generated by both (1) using the collected microwave data and (2) using stored data of a prior reconstructed image of a different patient which closely matches data of the current patient. The centralized computing environment also has the capability to convert medical images in dielectric values to Hounsfield units. The method developed and described allows more accurate image reconstructions to occur in less time than if they were performed at the remote sites.
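As a rough illustration of the seeding idea in the first embodiment, here is a minimal Python sketch that picks a reconstruction seed by matching newly collected scattering-field data against stored data. The least-squares matching criterion and the array layout are my own assumptions for the example; the patent does not prescribe this particular implementation.

```python
import numpy as np

def select_seed(collected_fields, stored_fields, stored_reconstructions):
    """Choose a reconstruction seed by finding the stored scattering-field data
    that most closely matches the newly collected data.

    collected_fields: 1-D array of measured scattering-field samples
    stored_fields: 2-D array with one row of stored samples per prior case
    stored_reconstructions: list of prior reconstructed images, same order as rows
    """
    # Simple least-squares distance used purely for illustration; the patent
    # does not mandate a particular matching criterion.
    distances = np.linalg.norm(stored_fields - collected_fields, axis=1)
    best = int(np.argmin(distances))
    return stored_reconstructions[best]
```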

It is exciting to work on new technology and methods that can have a real impact on the health of patients. Below are three patent certificates that were created to celebrate the accomplishment of having these three patents granted.

Todd McCollough Patents Ellumen Celadon

Years ago I had the opportunity to attend the Outback Bowl on January 1, 2010, in Tampa, Florida, at Raymond James Stadium. The Northwestern Wildcats played the Auburn Tigers in the college football bowl game.

The Tigers beat the Wildcats by a score of 38 to 35. I took many pictures while at the game but did not publish them on this site at the time. Below you can find some pictures I took while at the game.

northwestern_outback_bowl_2010

northwestern_auburn_outback_bowl

northwestern_outback_bowl_halftime

northwestern_outback_bowl_field_goal

northwestern_auburn_college_football

northwestern_outback_bowl_fans

northwestern_marching_band_outback_bowl

auburn_outback_bowl_2010

northwestern_outback_bowl_tigers_wildcats

I was able to attend a college basketball game at Allstate Arena in Rosemont, Illinois, on Friday, December 1, 2017. The Northwestern University Wildcats played the University of Illinois Fighting Illini. Northwestern is playing all home games for the 2017 to 2018 basketball season at Allstate Arena while Welsh-Ryan Arena undergoes a renovation.

Northwestern beat Illinois by a score of 72 to 68 in overtime. Northwestern guard Scottie Lindsey came off the bench to score a team-high 22 points. Northwestern forward Vic Law scored 16 points with seven rebounds, while Northwestern guard Bryant McIntosh scored 14 points with six assists.

Orange-wearing Illini fans greatly outnumbered purple-wearing Wildcats fans in attendance. Several Northwestern football players and head football coach Pat Fitzgerald were also in attendance, as the players named to the All-Big Ten team this season were recognized during a timeout. The Northwestern University “Wildcat” Basketball Band and Northwestern University director of athletic bands Daniel J. Farris were also there to play songs for the crowd throughout the game. Below are some pictures I took while at Allstate Arena during the Illinois vs. Northwestern college basketball game.

Northwestern Illinois basketball pre-game warmup

Northwestern Illinois basketball national anthem

Northwestern Illinois basketball pre-game huddle

Northwestern Illinois basketball starting line-ups

Northwestern Illinois basketball tip off

Northwestern Illinois basketball dribbling

Northwestern Illinois basketball shooting

Northwestern Illinois basketball three pointer

Northwestern Illinois basketball all big ten football

Northwestern Illinois basketball free throw

Northwestern Illinois basketball go u cheer

Northwestern Illinois basketball band

Northwestern Wildcats Basketball

Northwestern Illinois basketball banner

Northwestern Illinois basketball layup

Northwestern Illinois basketball crowd

Northwestern Illinois basketball Allstate Arena

Northwestern Illinois basketball rebound

Northwestern Illinois basketball game winner

I was able to attend the Radiological Society of North America (RSNA) annual meeting at McCormick Place in Chicago, IL, which took place from November 26 to December 1, 2017. The annual meeting is a very large gathering of industry leaders in medical imaging, radiologists, and other related industry professionals. This was the 103rd Scientific Assembly and Annual Meeting, with the tagline: Explore, Invent, Transform. This year the meeting was heavily focused on topics around machine learning, virtual reality, and 3D printing. As always, there were many exhibitors ready to discuss and demonstrate new medical imaging devices. There were also interesting plenary sessions, educational courses, and scientific sessions, along with numerous posters and presentations.

A popular feature at RSNA this year was a deep learning classroom presented by the NVIDIA Deep Learning Institute (DLI), designed for attendees to engage with machine learning tools, write algorithms, and improve their understanding of emerging machine learning technology. In one of these sessions, attendees trained a deep neural network to recognize handwritten digits. In another session, attendees trained convolutional neural networks (CNNs) to create biomarkers that identify the genomics of a disease without the use of an invasive biopsy. In yet another session, attendees segmented magnetic resonance imaging (MRI) scans to measure parts of the heart.
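To give a flavor of the handwritten-digit exercise mentioned above, here is a minimal Python/Keras sketch of training a small network on the MNIST digits. It is only an illustration of the kind of exercise involved and is not the framework or material actually used in the DLI sessions.

```python
import tensorflow as tf

# Load the MNIST handwritten digits and scale the pixel values to [0, 1].
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# A small fully connected network is enough to classify the digits reasonably well.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5, validation_data=(x_test, y_test))
```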

Another feature this year was a separate section for machine learning showcase exhibitors. This section allowed those interested in machine learning to easily network with those in the field, and it featured a machine learning theater with presentations from industry leaders. For example, in one presentation, Google Cloud talked about machine learning in imaging and how to build your own models on the cloud. In another presentation, Siemens Healthineers discussed artificial intelligence solutions for clinical decision making by turning medical images into biomarkers to help increase the effectiveness of care. There was also a 3D printing theater with many posters and actual 3D printed parts nearby. In addition, there were several virtual reality demos set up for attendees to try for themselves.

I was able to attend many interesting courses on machine learning, radiomics, 3D printing, virtual reality, and predictive analytics. For example, one course I attended discussed how to use KNIME to incorporate radiology data sources into predictive models, interpret the results, and create visualizations. In another course, there was an interesting talk about using virtual reality in medical education and how it can greatly decrease the time needed to teach students compared to PowerPoint presentations. In yet another course, instructors walked attendees through using Mimics and 3-matic from Materialise, teaching participants how to segment musculoskeletal, body, neurological, and vascular systems from DICOM files and export them as Standard Tessellation Language (STL) files for use with a 3D printer.

I was also able to attend the plenary session by Michio Kaku titled “The Next 20 Years: How science and technology will revolutionize business, the economy, jobs, and our way of life.” In the talk, Dr. Kaku discussed the next wave of wealth generation in our modern economy, which he believes will come from advancements at the molecular level, including in artificial intelligence, nanotechnology, and biotechnology, all linked together by the cloud. He believes that information will be everywhere and that computers will become like electricity is today: so ubiquitous that the word is rarely mentioned. Dr. Kaku acknowledged that robots will replace some jobs in the future, but said robots are weak in three areas: 1) pattern recognition, 2) common sense, and 3) human interactions. Thus he believes that in many cases artificially intelligent systems will aid humans rather than replace them.

Below are some of the pictures I took while at the RSNA annual meeting in 2017, in Chicago, IL.

RSNA McCormick Place

RSNA 2017 Chicago McCormick Place

Welcome RSNA 2017

RSNA South Technical Exhibits

RSNA Learning Center

RSNA Posters

RSNA Cardiac Informatics Machine Learning

RSNA Deep Learning Classroom

RSNA NVIDIA digits

RSNA Virtual Reality

RSNA 3D Printing in Medicine

RSNA 3D Printing Posters

RSNA 3D Imaging in Anatomic Pathology

RSNA 3D Printing Technology

RSNA 3D Printing Schedule

RSNA National Cancer Institute Cancer Imaging Archive

RSNA QIRR Meet the Experts

RSNA Rontgen Reimagined

RSNA Welcome 2017

RSNA Booth Sitting

RSNA Canon Toshiba Medical

RSNA Toshiba

RSNA Machine Learning Google Cloud

RSNA Carestream

RSNA ziehm imaging

RSNA FUJIFILM

RSNA HOLOGIC

RSNA General Electric GE

RSNA Samsung

RSNA HITACHI

RSNA Konica Minolta

RSNA Bayer Angiography

RSNA Siemens Healthineers

RSNA Elsevier

RSNA Philips

RSNA lifeIMAGE

RSNA Next 20 Years

RSNA Michio Kaku

RSNA Tours and Events

RSNA Technical Exhibit Map South

RSNA Technical Exhibit Map North

RSNA 2017 Chicago

RSNA Corporate Partners 2017

RSNA waterfront Chicago

RSNA 2017 Chicago Landscape