Tuesday Views 2 November 2021

Hey there!

Orion Drones wants to bring drone services to your town, your next event, your business, to you! We are passionate about photography, aeronautics, inspections, and drone service integration. We bring you news and views here on the blog roll each week on Tuesday at 8:26 am EST (13:26 UTC). Thanks for visiting!

Verticals volunteer gig – Holiday delay – Executive housekeeping – Drone build – Studying for PE Exams – Job hunt

Community Development

How It’s Going: Update

The Verticals after-school program is partnering with student teachers from Cal U. I am really excited to be involved with so many good groups. It is a credit to William James, Team Humanity Clothing Company, and Verticals. I went to the Boys Clubs of America as a kid and learned to code there in 5th and 6th grade, so I am lucky to be able to give back. A link to the group is provided. So now we have high school and college students learning together, and students taking the jump to a career in teaching get to meet some curious young minds.

Topics in the program include artificial intelligence (AI), robotics, drones, digital fabrication, coding, media production, design studio, blockchain, and crypto, presented alongside sports events and fitness resources. The Verticals Youth Empowerment Center is designed to stimulate young minds, give parents an inclusive environment in which to socialize, and offer youth a sense of pride in themselves and their community. I am really excited to be a part of the team.

Peak leaves

I managed to get a few shots of Pennsylvania yesterday, and I will try again today while the weather stays unseasonably fair. Right after the leaves peak, we get a lot of rain that debrides the trees of their autumnal regalia. I can share these photos of the Lane Bane Bridge with the fall spectrum in the background.

Near-Infrared Remote Sensing part 1 of 2

Shininess: I like shiny things. If I were creating an orthographic mosaic of images to reconstruct an object as a digital twin, I would be faced with cost. I want the most accurate model I can afford, computationally speaking, yet I know an image is subject to inaccuracies from noise, gamma corrections, blur, and exposure variation. If you have ever seen Google Earth 3D reconstructions, you might have wondered why those trees and cars and buildings look sort of deformed. ML is used to correct for those inaccuracies using synthetic training libraries (people are wild about libraries, amirite?). The range of affordable camera angles is limited. The number of images is limited. The number of nodes from object to sensor is theoretically unlimited, but its utility is constrained by cost: the less, the better. Mapping or modelling objects at night is a challenge. Nobody wants to spend oodles of time hand-training a computer to recognize elements of features: edge detection, ranges or types of motion over time in a limited space, albedo values, Gaussians, tuning bounding boxes for the objects, working those changelogs with arbitrary color assignments for scheduling purposes. Why not create a synthetic library instead? Think of it as the CliffsNotes of 3D fabrication.
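As a toy illustration of why synthetic libraries are so attractive, here is a minimal Python sketch: it generates tiny fake grayscale images containing one bright rectangle each, and every image comes with a perfect bounding-box label for free, with no hand-annotation required. All of the sizes, pixel values, and names below are invented for illustration.

```python
import random

# A toy "synthetic library" generator: each sample is a tiny grayscale image
# containing one bright ("shiny") rectangle, paired with its exact ground-truth
# bounding box. Real synthetic libraries render far richer scenes, but the
# appeal is the same: labels come for free and are never wrong.
def make_sample(size=16, rng=random.Random(0)):
    x0 = rng.randrange(0, size - 4)          # box origin, kept inside the frame
    y0 = rng.randrange(0, size - 4)
    w, h = rng.randrange(2, 5), rng.randrange(2, 5)
    img = [[0.0] * size for _ in range(size)]
    for y in range(y0, y0 + h):
        for x in range(x0, x0 + w):
            img[y][x] = 1.0                  # the object's pixels
    return img, (x0, y0, w, h)               # image plus its perfect label

# One deterministic 100-sample "library", ready to feed a detector.
dataset = [make_sample() for _ in range(100)]
```

The shared `rng` default makes the library reproducible: rerunning the script regenerates the exact same 100 samples, which is handy when you are comparing training runs.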

Enter the nanodrone. It is tiny, cheap, cool, and fits in your pocket. How do you put an image classifier on a tiny little drone? How do you make a smarter nanodrone? Why, you make it ignore stuff, of course. Retroreflectivity is useful in indoor environments, but useless outdoors at night. Filming in darkness is useless if you are using optical cameras, but what about RFID classifiers? What about environments cluttered with RF noise and evolving signals? Get your mortarboard fitting here. I wanted to simply write an unfair comparison of retroreflection vs. luminance when paired in sensor fusion with perovskite PeLEDs, but I fell down a rabbit hole. That means you should expect a return to this topic next week.

It’s not necessarily all about lighting

I wanted to write about reasons we use bright lights on drones at night, but it seems like a no-brainer. If you disagree, close your eyes and walk to New Jersey. If you are already in New Jersey, you should just keep walking. I am kidding! Also, without opening your eyes, note the exact time and heading of your arrival (eyes still closed). For those of you who are blind, I would not recommend trying this same experiment in a car to level the playing field, because a car might level more things than the field. My point is this:

Measurements of plant growth density, or of health indicators represented by variation in wavelength, can be degraded or omitted entirely by computational errors, uncorrected lens aberrations, missing error-correcting filters, and so on. You want the smoothest mirrors possible, but you get smog. A normalized difference vegetation index (near-infrared radiation at 0.7 to 1.0 μm minus visible radiation, divided by near-infrared plus visible radiation) gives you the NDVI rating for each pixel under the lighting conditions, but meanwhile your butter beans are suffering from an unknown invasive mold. You have tried everything, but maybe it’s time to augment the reality of a computational system with a reduction in input. How do you make a computer sense things the way humans do? Between visible human sight and silicon response lie near-infrared sensors and detection strategies. I want to expand on this topic with industry news next week. This week, I am going to ramble into the subject.

The formula always yields a result between -1 and +1. A +1 is the highest possible concentration of green leaves, and a zero means no vegetation. There is a direct relationship between photosynthesis and the “greenness” of a plant.
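For the curious, the NDVI arithmetic above fits in a few lines of Python. This is a per-pixel sketch; the band reflectance values below are made-up numbers for illustration, not real sensor readings.

```python
# NDVI per pixel: (NIR - VIS) / (NIR + VIS), always in [-1, +1].
def ndvi(nir, vis):
    """Normalized difference vegetation index from two band reflectances."""
    if nir + vis == 0:
        return 0.0  # guard against division by zero on totally dark pixels
    return (nir - vis) / (nir + vis)

# Fabricated example reflectances:
healthy_leaf = ndvi(nir=0.50, vis=0.08)  # strong NIR bounce -> high NDVI
bare_soil = ndvi(nir=0.30, vis=0.25)     # similar bands -> NDVI near zero
```

A healthy leaf reflects most near-infrared light while absorbing visible red for photosynthesis, which is exactly why the numerator gets large for green vegetation.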

ML-Assisted Object Recognition Applications

Over time, what constitutes a desirable range of acceptable growth can be calibrated from a history of values: the average lowest and highest readings over many years or cycles of growth. As always, clouds and aerosols will skew readings, and that is an opportunity for drones to address. Accurate forecasts for the emission, egress, and absorption of gases or colloidal suspensions are within the domain of drone applications to solve. These forecasts could even prepare aircraft for travel through temperature inversions. This technology could be used to monitor coastal marine vegetation, or possibly even find a shipwreck in shallow water. Lake Erie, possibly taking its name from the Iroquoian word for “long-tailed cat,” erielhonan, is home to one of the most concentrated collections of shipwrecks in the world. The western end of the Great Lake is the shallowest area, and it is strewn with underwater features that cause boats to capsize and sink. There are hundreds of shipwrecks there.
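The calibration-from-history idea can be sketched like this: collect several seasons of NDVI readings, take the lowest and highest value seen at each step of the growth cycle, and flag any new reading that falls outside that historical envelope. The seasons, values, and margin below are all fabricated for illustration.

```python
# Three fabricated seasons of NDVI readings, four growth steps each.
years = [
    [0.61, 0.64, 0.70, 0.68],  # season 1
    [0.58, 0.66, 0.72, 0.69],  # season 2
    [0.60, 0.63, 0.71, 0.67],  # season 3
]

# Historical envelope: lowest and highest reading at each growth step.
lows = [min(col) for col in zip(*years)]
highs = [max(col) for col in zip(*years)]

def anomalous(reading, step, margin=0.05):
    """True if a new reading falls outside the historical envelope (+/- margin)."""
    return not (lows[step] - margin <= reading <= highs[step] + margin)
```

With more seasons of data the envelope tightens, and an NDVI crash from something like an invasive mold stands out long before it is visible to the eye.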

Looking ahead, the techniques established in cubesat imagery today will help train analysis of near-infrared data, the electromagnetic interzone between silicon and human visual response, from the James Webb Space Telescope in the search for Earth analogs around other stars. The future looks sort of rosy there. The JWST is expected to increase our reach into the universe tenfold when it launches in December 2021. Using neural networks, it is possible to identify and classify geomorphological structures based on histograms or eigenshapes, generatively: indications of flowing liquid, crystalline deposits, veins of ore, the fossils of long-extinct living things. Flying over old Trojans locked in Jupiter’s orbit, or out in the Oort cloud, one might find rare and wondrous things to drag back to Earth, or use as a waypoint for interstellar exploration... I’m going to need you to come in on Saturday and build that warp drive, m’kay?


Identifying exoplanet geomorphologies based on different planetary compositions, weather, and evolution requires an autonomous robotic classifier that can learn from millions of images to detect fractal patterns of evaporation, liquid movement, or other visual effects indicating seismic activity.


Deploying near-infrared cameras precisely positioned within arrays deep in interstellar space could produce quantum measurements of gravitational waves, which, in turn, could inform us what is in the path of our solar system as it hurtles around the Milky Way in its screwy orbit. As a spitball figure, we might be going roughly 230 km/s in our galactic orbit, hundreds of times the speed of sound, possibly in a magnetic, solitonic waveguided tunnel, an ancient path.

Spectroscopy allows scientists to determine material or field behavior based upon these measurements.

Medicine and Biology

In medicine, this plays out with real-time neuroimaging, for example, or oxygenation monitoring in microvascular systems: valuable tools for non-destructive exploration of organisms through endoscopy, or for remotely monitoring conditions affected by internal or external dynamics, whether metabolism, respiration, movement to and from resources, immunology, or response to environmental hazards. The key for all of these information lockboxes is in ML algorithmic keychains.

Machine Learning

A dynamic approach is the key. Through supervised, unsupervised, or hybrid learning, a feature recognition system becomes an exportable registry ready for validation on unknown data. Here again, the functions of machine learning involve a lambda calculus of hyperparameters that occasionally hitchhikes on review by a data scientist, who defines boundary conditions for distributions within parametric space, disregarding or down-weighting outliers. Just imagine how much machine learning has improved data analysis for photodetector arrays in subatomic particle accelerators.
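Here is a minimal sketch of the “train, then export a registry for validation on unknown data” idea: a nearest-centroid classifier whose entire learned state is a small dictionary of per-class feature means. The features, labels, and function names below are my own toy inventions, not any particular library’s API.

```python
# Train: reduce labelled samples to a "registry" of per-class mean vectors.
def fit(samples):
    """samples: list of (feature_vector, label) pairs."""
    sums, counts = {}, {}
    for vec, label in samples:
        s = sums.setdefault(label, [0.0] * len(vec))
        for i, v in enumerate(vec):
            s[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lab: [v / counts[lab] for v in s] for lab, s in sums.items()}

# Validate: classify an unknown vector by its nearest class centroid.
def predict(registry, vec):
    dist = lambda c: sum((a - b) ** 2 for a, b in zip(vec, c))
    return min(registry, key=lambda lab: dist(registry[lab]))

# Toy two-feature training data (e.g., NIR vs. visible response).
train = [([0.9, 0.1], "leaf"), ([0.8, 0.2], "leaf"),
         ([0.1, 0.9], "soil"), ([0.2, 0.8], "soil")]
registry = fit(train)  # the exportable model state: just a dict
```

The point of the dictionary-as-registry design is portability: the trained state can be serialized, shipped to a nanodrone, and validated against data it has never seen.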

LEDs on drones at night need to be bright and lightweight. LEDs with stronger candela ratings will bring drones into visual compliance more often, so scientists are testing alternatives to regular old LEDs, just as they test new forms of batteries to one day replace LiPos. Those suckers are HEAVY!

Studying for PE Exams


  • Studying for computer science and electronics licensure first; then I will go after the mechanical engineering exam. This could take a while.

Job Hunt

  • Continuing to grow the audience for my quaint 2018 thesis on medical drone service integration, which is still netting the ‘top award’ for reads among material published by Cal U students on Researchgate.net.
  • In contact with recruiters, willing to adapt to new challenges. I technically have two jobs, but the combined income is still well below the poverty line.


About Me

I became a drone technology service integration researcher in 2017 while in university. When I am not working on drones, I am researching related scientific reports and industry news, flying drones, or tinkering with electronics. Contact me!
