This application can be used to augment reality by combining a simulated
light-painting mission with the actual footage of a viewing mission. Two
missions are required.
Light-Painting Mission
This is a mission that draws a recognizable object in the sky. While
any mission can be used, the best results will be obtained with a
true light-painting mission.
Use these instructions to create a light-painting mission if you do not already have one.
A "good" light-painting mission will have many waypoints - perhaps 100
or more.
Once a suitable light-painting mission has been created, save it in
Litchi's Mission Hub. Export this mission as a CSV file named something
like "painting.csv". The exact GPS location of this mission may need to
be adjusted and re-exported later.
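If the painting later needs to be relocated, the waypoint coordinates can be shifted programmatically rather than redrawn. The sketch below is a minimal example, assuming the exported Litchi CSV has latitude and longitude as its first two columns (check your export; the column layout may differ) and using hypothetical file names:

```python
import csv

def shift_mission(src, dst, dlat, dlon):
    """Shift every waypoint in a Litchi-style CSV by (dlat, dlon) degrees.

    Assumes latitude and longitude are the first two columns; adjust the
    indices if your export differs.
    """
    with open(src, newline="") as fin, open(dst, "w", newline="") as fout:
        reader = csv.reader(fin)
        writer = csv.writer(fout)
        for row in reader:
            try:
                lat, lon = float(row[0]), float(row[1])
            except ValueError:
                writer.writerow(row)  # pass header rows through untouched
                continue
            row[0] = f"{lat + dlat:.7f}"
            row[1] = f"{lon + dlon:.7f}"
            writer.writerow(row)

# Example: nudge the whole painting roughly 110 m north
# (0.001 degrees of latitude).
# shift_mission("painting.csv", "painting_shifted.csv", 0.001, 0.0)
```

Re-import the shifted CSV into Mission Hub to confirm the new location before exporting again.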
Viewing Mission
This mission will be used to view the progress of the light-painting
mission.
To create this mission, one might place a POI at the center of the
light-painting mission, at a height of roughly one half that of the
painting. In the example provided, "Spiral Mission Creator" was used
to create a spiral around the location of the light-painting mission.
A "good" viewing mission will also have many waypoints - perhaps 100 -
or at least several curved turns.
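For reference, the geometry of a descending spiral like the one "Spiral Mission Creator" produces can be sketched in a few lines. This is only an illustration with made-up parameters, not that tool's actual algorithm, and it uses a crude flat-earth approximation:

```python
import math

def spiral_waypoints(center_lat, center_lon, radius_m=80.0,
                     turns=3, points=100, top_alt=60.0, bottom_alt=20.0):
    """Generate (lat, lon, alt) waypoints spiralling down around a center.

    Flat-earth approximation: 1 degree of latitude is ~111,320 m, and a
    degree of longitude shrinks with the cosine of the latitude.
    """
    deg_lat = 1.0 / 111320.0
    deg_lon = deg_lat / math.cos(math.radians(center_lat))
    pts = []
    for i in range(points):
        t = i / (points - 1)
        ang = 2 * math.pi * turns * t
        lat = center_lat + radius_m * math.sin(ang) * deg_lat
        lon = center_lon + radius_m * math.cos(ang) * deg_lon
        alt = top_alt + (bottom_alt - top_alt) * t
        pts.append((lat, lon, alt))
    return pts

waypoints = spiral_waypoints(40.0, -105.0)
```

Waypoints like these would still need to be written out in Litchi's CSV format and imported into Mission Hub.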
Using Virtual Litchi Mission, export this mission to Google Earth Pro
and then right-click on the mission in Google Earth Pro and save it
(Save Place As...) as either a KML or KMZ file. Name it something like
"view.kmz" or "view.kml".
IMPORTANT: Prior to exporting the virtual mission from Google Earth Pro, make sure your field of view (FOV) has been properly selected. For details see Litchi's User Guide in the Mission Hub section (bottom of page).
On page one of "Light-Painting Simulator":
Select the desired line thickness of the light-painted line to be simulated. The default is a good starting value.
Select the desired color of the light-painted line to be simulated. Depending on the background of the actual footage, the default might be a good starting value.
The "End Padding" allows the last frame of the light-painting to be frozen for a while as the viewing mission completes. If this is unclear, leave it set to zero for your first trial.
Select the light-painting mission CSV file.
Select the viewing KML file exported from Google Earth Pro.
Generate the light-painting simulation.
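Under the hood, a simulation like this amounts to turning the CSV waypoints into a colored line in a KML file that Google Earth Pro can render. The sketch below shows that idea only; it is not the simulator's actual output. Note that KML colors are aabbggrr hex (alpha, blue, green, red), not RGB:

```python
def waypoints_to_kml(points, color_aabbggrr="ff0000ff", width=4):
    """Render (lat, lon, alt) tuples as a single colored KML LineString.

    KML coordinates are ordered lon,lat,alt; "absolute" altitude mode
    places the line at the given height above sea level.
    """
    coords = "\n".join(f"{lon},{lat},{alt}" for lat, lon, alt in points)
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <Style><LineStyle>
      <color>{color_aabbggrr}</color><width>{width}</width>
    </LineStyle></Style>
    <LineString>
      <altitudeMode>absolute</altitudeMode>
      <coordinates>
{coords}
      </coordinates>
    </LineString>
  </Placemark>
</kml>"""

# A two-point red line (aabbggrr "ff0000ff" is opaque red).
kml = waypoints_to_kml([(40.0, -105.0, 30.0), (40.001, -105.0, 35.0)])
```

The simulator's "line thickness" and "color" settings correspond to the width and color elements of a KML LineStyle.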
Create a black image (JPG or PNG) using whatever software you have available.
Navigate to the light-painting mission area, orient the view to be looking straight down, and zoom out.
Use the "Add Image Overlay" tool to import the black image created earlier:
Open the "Add Image Overlay" tool.
Browse to find the black.png or black.jpg file created earlier.
Make sure the "Altitude" is "Clamped to ground".
Grab the corners of the image overlay and drag them way out to cover any nearby terrain.
Click "OK".
Drag the KML file downloaded from the "Light-Painting Simulator" into Google Earth Pro.
Google Earth Pro should orient the view to show a completely black display.
Turn off all layers except for "Terrain".
In the View menu uncheck "Atmosphere".
From the Tools menu select "Movie Maker".
Under "Record from", choose "A saved tour:" and select "Run Virtual Light Painting..." from the dropdown list.
Note the location of the "Save to" setting and change it if necessary.
Set the "Picture size (pixels)" appropriately.
Set the "Frames per second" appropriately.
Select "Create Movie". It may take a few minutes to generate the light-painting movie.
If any terrain is visible in the movie, return to the "Add Image Overlay" step and enlarge the overlay to cover more terrain.
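The solid black image from the first step above can also be generated without an image editor. The following is a minimal PNG writer using only Python's standard library (the hypothetical output name "black.png" matches the example used earlier):

```python
import struct, zlib

def chunk(tag, data):
    """Build one PNG chunk: length, tag, data, CRC over tag+data."""
    return (struct.pack(">I", len(data)) + tag + data
            + struct.pack(">I", zlib.crc32(tag + data) & 0xffffffff))

def write_black_png(path, width, height):
    """Write a solid-black 8-bit RGB PNG of the given size."""
    # Each scanline: one filter byte (0 = none) then width black RGB pixels.
    raw = b"".join(b"\x00" + b"\x00\x00\x00" * width for _ in range(height))
    ihdr = struct.pack(">IIBBBBB", width, height, 8, 2, 0, 0, 0)
    png = (b"\x89PNG\r\n\x1a\n"
           + chunk(b"IHDR", ihdr)
           + chunk(b"IDAT", zlib.compress(raw))
           + chunk(b"IEND", b""))
    with open(path, "wb") as f:
        f.write(png)

write_black_png("black.png", 1920, 1080)
```

The exact pixel dimensions do not matter much, since the overlay's corners get dragged out to cover the terrain anyway.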
In DaVinci Resolve: (I use DaVinci Resolve. If you use different editing software, it probably has similar functions.)
On the "Media" page, add both the actual viewing footage and the Google Earth Pro movie.
On the "Edit" page, drag both clips into the timeline. Place the Google Earth Pro movie on top of the actual footage.
Select the Google Earth Pro clip.
In the "Inspector" window, change the "Composite Mode" to "Add".
Adjust the starting points of the clips relative to each other as appropriate.
The remaining steps depend upon your workflow.
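The "Add" composite mode works because it sums the two frames channel by channel and clamps at white: wherever the simulation is black it contributes nothing, so the real footage shows through, and only the painted line adds light. In pure Python, per pixel:

```python
def add_blend(bottom, top):
    """Additive blend of two RGB pixels, clamped to 255 per channel.

    Black (0, 0, 0) in the top clip leaves the bottom clip unchanged,
    which is why the simulation's black background disappears.
    """
    return tuple(min(b + t, 255) for b, t in zip(bottom, top))

# Black simulation pixel over real footage: the footage shows through.
assert add_blend((120, 80, 200), (0, 0, 0)) == (120, 80, 200)
# A bright painted pixel adds light, clamping at white.
assert add_blend((120, 80, 200), (255, 255, 100)) == (255, 255, 255)
```

This is also why every layer, atmosphere effect, and stray patch of terrain must be blacked out in Google Earth Pro: anything non-black in the simulation clip would bleed into the footage.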
Restore Settings in Google Earth Pro
From the "View" menu, enable "Atmosphere".
In the "Layers" window pane, enable "Borders and Labels", "Roads", and "3D Buildings" as desired.
This utility can be used to augment reality by combining a simulated
light-painting mission with the actual footage of a viewing mission,
creating an interesting blend of real and synthetic drone footage.
Two missions are required:
Light-Painting Mission
The light-painting mission draws the shape of a recognizable
two- or three-dimensional object in the sky.
This mission does not have to be flown. Instead, it will be simulated.
All that is required is the Litchi CSV file of the mission.
Viewing Mission
The viewing mission is one that will be used to view the progress of
the light-painting mission. This mission will need to be flown with
a drone to capture the actual footage.
Virtual Litchi Mission data will be combined with light-painting mission
data to create a simulation video that can be composited on top of
actual drone footage.
Perhaps the best way to understand what can be accomplished with this process
is to view the example video displayed to the right. The following tools will
be required:
A drone
Two Litchi Missions (painting and viewing)
Virtual Litchi Mission
Google Earth Pro
DaVinci Resolve (or equivalent)
This web application and its instructions.
If you would like any help getting this to work, please let me know.