In Broadcast - May 2018 - 56
www.inbroadcast.com | Vol: 8 - Issue 5 | May 2018
NAB Debut for Telemetrics' OmniGlide
Telemetrics made a number of new announcements at NAB 2018 including its
new OmniGlide Roving Platform and software enhancements for its RCCP line...
OmniGlide Roving Platform
The OmniGlide Roving Platform has been labelled by the company as the future of automated studio camera operation. It features a three-point contact design for tripod-like stability, while allowing the pedestal to turn more sharply and smoothly thanks to the advanced omnidirectional electro-mechanical servomotors integrated into the pedestal.
The pedestal has three modes of operation - Robotic Control, Manual Crabbing, and Manual Free Roam - and utilises advanced software and XY sensors to aid in its unmanned operation.
The OmniGlide Platform's internal
servo control system governs how
smoothly the pedestal moves.
An independent drive design is programmed to auto-compensate for anomalies in the floor and maintain a constant speed, so that any correction is undetectable to the viewer at home. If it encounters less-than-perfect floor conditions, the system immediately compensates by sending more power to the wheels to maintain smooth motion.
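The compensation described above is, in essence, closed-loop speed control: when a floor seam drags the measured wheel speed below target, more drive power is applied. A minimal sketch of that idea is below; the proportional-integral structure, function names, and gains are the author's illustrative assumptions, not Telemetrics' implementation.

```python
# Hypothetical sketch of closed-loop wheel-speed control: a simple
# proportional-integral (PI) controller that raises drive power when
# measured speed drops below the target. Gains are illustrative only.

def make_speed_controller(target_speed, kp=0.8, ki=0.2):
    """Return a PI controller mapping measured speed -> drive power (0..1)."""
    integral = 0.0

    def step(measured_speed, dt):
        nonlocal integral
        error = target_speed - measured_speed  # positive when the wheel slows
        integral += error * dt                 # accumulate sustained error
        power = kp * error + ki * integral     # more power when speed drops
        return max(0.0, min(1.0, power))       # clamp to the drive's range

    return step

controller = make_speed_controller(target_speed=0.5)  # metres per second
# A floor anomaly drags the wheel down to 0.3 m/s; the controller responds
# with extra power to restore smooth, constant motion.
boost = controller(measured_speed=0.3, dt=0.02)
```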
With the XY scanners, the roving pedestal can "learn" the room's physical details and auto-correct for its position and orientation, avoiding the need for manual intervention and ensuring crew safety.
An optional feature offers users the
ability to leverage artificial intelligence
algorithms that allow the pedestal to
analyse its surrounding environment
and correct itself automatically by
learning the studio parameters.
"Using three rugged spherical balls
instead of traditional wheels found
on other products on the market,
the new OmniGlide Roving Platform
takes robotic studio camera
operation to a whole new level,"
said Michael Cuomo, Vice President of Telemetrics.
The OmniGlide Platform also includes a correction feature that allows the roving pedestal to self-correct its own orientation, using XY data from the scanners. Optional
infrared cameras can be integrated
into the pedestal to provide absolute
positioning by pointing at strategically
placed reflectors on the studio ceiling,
enabling precise and repeatable
camera moves to be performed
with ease. The platform can also be
operated manually, if required, by
using its "Motion Assist" technology.
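The orientation self-correction described above can be sketched as comparing the heading implied by two XY position fixes with the intended heading, then commanding the smallest rotation that removes the error. This is the author's assumed formulation, not Telemetrics' algorithm:

```python
# Illustrative sketch (assumption, not Telemetrics' code): derive heading
# from two XY fixes, compare with the intended heading, and return the
# minimal signed correction, wrapped to (-180, 180] degrees.

import math

def heading_from_fixes(p0, p1):
    """Heading in degrees implied by motion from XY fix p0 to fix p1."""
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    return math.degrees(math.atan2(dy, dx))

def orientation_correction(intended_deg, p0, p1):
    """Signed correction in degrees, wrapped to (-180, 180]."""
    error = intended_deg - heading_from_fixes(p0, p1)
    return (error + 180.0) % 360.0 - 180.0

# Pedestal was meant to track at 90 deg but the fixes show drift to ~85 deg:
correction = orientation_correction(90.0, (0.0, 0.0), (0.087, 0.996))
```

Wrapping the error keeps the pedestal from spinning the long way round when the intended and measured headings straddle the ±180-degree boundary.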
Robotic Camera Control Panels
Telemetrics also announced new
software features for its Robotic
Camera Control Panels (RCCP) that
streamline system configuration,
including improvements to the RCCP-1A-STS studio software and the RCCP-1A-LGS legislative software.
At NAB 2018, the RCCP-1A-STS
and RCCP-1A-LGS control panels
were showcased with Telemetrics'
reFrame Automatic Shot Correction
technology. This helps users of
automated news studios and large
robotically controlled multi-camera
venues to overcome unpredictable
occurrences, such as the desired on-camera subject slightly moving out of frame, and to make quick adjustments.
The new reFrame software uses built-in artificial intelligence to analyse the video from the robotic cameras, applying facial recognition algorithms to lock the cameras onto the talent and automatically trim the shot without the operator having to touch the controls. Multiple show settings can
be saved internally, each with its own
unique inventory of shots that can be
recalled at a moment's notice and
sent to air instantly when required. No
separate hardware is needed as it is all
built into the RCCP control panels.
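The auto-trim behaviour described here boils down to measuring how far the detected face has drifted from frame centre and nudging pan/tilt to re-centre it. The sketch below is a hedged illustration of that loop; the frame geometry, deadband, and gain are assumptions, and the article does not disclose Telemetrics' actual method:

```python
# Hedged sketch of a reFrame-style auto-trim step: given a detected face
# bounding box, compute the drift from frame centre and emit small pan/tilt
# trim commands. All constants are illustrative assumptions.

FRAME_W, FRAME_H = 1920, 1080
DEADBAND = 0.02   # ignore drift under 2% of frame so the shot stays steady

def trim_for_face(face_box, gain=0.1):
    """face_box = (x, y, w, h) in pixels -> (pan_trim, tilt_trim) commands."""
    x, y, w, h = face_box
    # Normalised offset of the face centre from the frame centre, in [-0.5, 0.5].
    dx = (x + w / 2) / FRAME_W - 0.5
    dy = (y + h / 2) / FRAME_H - 0.5
    pan = gain * dx if abs(dx) > DEADBAND else 0.0
    tilt = -gain * dy if abs(dy) > DEADBAND else 0.0  # screen y grows downward
    return pan, tilt

# Talent drifted right of centre; the trim pans the camera to re-centre them:
pan, tilt = trim_for_face((1200, 480, 160, 160))
```

The deadband is what keeps a well-framed shot untouched: only drift beyond the threshold produces a trim command.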
"The reaction to reFrame has been
overwhelmingly positive from all
that have seen it because it allows
operators to focus on the creative
parts of the show or live proceeding,"
continued Cuomo. "For example,
operators of automated studios don't
want the hassle of having to keep their
eye on every camera while attending
to other parts of the newscast.
reFrame offers them the security that
pre-planned shots are all trimmed
correctly, each and every time."
The operational simplicity of
Telemetrics' RCCP-M control panel
has also been increased, with the
introduction of a new Perspective
View feature. This lets users take a
picture of a desired angle or seat in a
room with a mobile phone, load it into
the control panel and then remotely
operate the camera based on that
view. Pushing a button associated with that stored image, which acts as a "hot spot", then automatically commands the camera to recreate that perspective.
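Conceptually, the hot-spot workflow maps each stored reference photo to a saved camera preset, and a button press recalls that preset. A minimal sketch under that assumption (field names and preset structure are hypothetical):

```python
# Illustrative sketch of the "hot spot" workflow: each stored photo maps to
# a saved pan/tilt/zoom preset; pressing the button recalls the preset.
# Data shapes here are the author's assumptions.

hot_spots = {}  # button id -> {"photo": path, "preset": (pan, tilt, zoom)}

def store_hot_spot(button, photo_path, pan, tilt, zoom):
    """Associate a reference photo and camera preset with a hot-spot button."""
    hot_spots[button] = {"photo": photo_path, "preset": (pan, tilt, zoom)}

def press(button):
    """Return the pan/tilt/zoom the camera should move to for this view."""
    return hot_spots[button]["preset"]

store_hot_spot("seat_3", "seat_3.jpg", pan=12.5, tilt=-4.0, zoom=2.0)
target = press("seat_3")
```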
All-Weather PT-WP-S5
Addressing the need for 24/7
unmanned camera positions in hard
to reach or impractical environments,
the company launched the PT-WP-S5
- an all-weather robotic pan tilt
head and camera housing system.
The new temperature-controlled PT-WP-S5 pan/tilt head and WP-HOUL-S5 metal housing are suited to TV station TraffiCam systems, outdoor racetracks and stadiums, as well as high-level security applications.
The head is designed for heavy payloads and fast, quiet on-air
operation. The system supports
large and small cameras (with a
load capacity of up to 90lbs), and
a variety of digital and analogue
lenses. It also provides highly
accurate communication control
(direct IP, RS-232, or RS-422 control) for direct and redundant camera control, paint, and shade activities.
Providing power and control signals
over a single cable, the WP-HOUL-S5
includes a sunscreen, a heating system that reduces icing and fogging of the glass windshield, and a wiper blade.
An optional washer is also available
to keep the camera's view clean.
The system includes an integrated mechanism to support Keyframe motion and on-the-fly motion time adjustments.
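Keyframe motion can be sketched as interpolating pan/tilt values between stored keyframes, with a time-scale factor standing in for the on-the-fly retiming mentioned above. The linear blend and all names are illustrative assumptions, not the product's actual motion engine:

```python
# Minimal keyframe-motion sketch: pan/tilt keyframes interpolated over time,
# with a time_scale factor to stretch or compress the move on the fly.
# Linear interpolation is an assumed simplification.

def keyframe_pose(keyframes, t, time_scale=1.0):
    """keyframes: list of (time, pan, tilt); returns (pan, tilt) at time t."""
    t = t / time_scale                      # on-the-fly retiming of the move
    if t <= keyframes[0][0]:
        return keyframes[0][1:]
    if t >= keyframes[-1][0]:
        return keyframes[-1][1:]
    for (t0, p0, q0), (t1, p1, q1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            a = (t - t0) / (t1 - t0)        # linear blend between keyframes
            return (p0 + a * (p1 - p0), q0 + a * (q1 - q0))

keys = [(0.0, 0.0, 0.0), (4.0, 40.0, 10.0)]     # (time, pan, tilt)
mid = keyframe_pose(keys, 2.0)                  # halfway through the move
slow = keyframe_pose(keys, 2.0, time_scale=2.0) # same move at half speed
```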
AR Graphics Demonstration
Augmented Reality is also a focus
for Telemetrics and at NAB 2018, the
company teamed up with Vizrt for
an AR graphics demonstration. The demo featured the Telemetrics PT-LP-S5 servo-controlled PTZ camera head synchronised with the Viz Virtual Studio rendering engine and its built-in multi-layer compositor to render virtual graphics interacting with a live video feed.
The Virtual Interface on the PT-LP-S5
utilises extremely high-resolution
encoders, which provide positioning
data with an accuracy of 0.00045
degrees. When used on a Telemetrics
TG-4 track system with the EP7
Televator, the camera position can be
recalled to within one ten-thousandth
of an inch anywhere along a track.
This real-time positioning data is then
used by the Tracking Hub to match the
camera position with the background,
and the virtual elements are rendered onto the scene by the Viz Virtual Studio rendering engine.
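As a back-of-envelope check on the encoder figure quoted above, a resolution of 0.00045 degrees corresponds to 360 / 0.00045 = 800,000 distinct positions per revolution. The conversion below is generic encoder arithmetic, not Telemetrics' implementation:

```python
# What 0.00045-degree encoder resolution implies: 800,000 distinct
# positions per revolution. Generic encoder arithmetic, for illustration.

DEG_PER_COUNT = 0.00045
COUNTS_PER_REV = round(360.0 / DEG_PER_COUNT)   # 800,000 counts

def counts_to_degrees(counts):
    """Convert a raw encoder count to an angle in degrees."""
    return counts * DEG_PER_COUNT

one_rev = counts_to_degrees(COUNTS_PER_REV)      # a full 360-degree turn
```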