Paul Schmirler Phones & Addresses

  • 7421 Chadwick Rd, Milwaukee, WI 53217 (414) 247-3755
  • Glendale, WI
  • 7421 N Chadwick Rd, Milwaukee, WI 53217 (414) 704-3824

Work

Company: Rockwell Automation
Position: Senior Software Engineer

Industries

Computer Software

Resumes

Senior Software Engineer

Location: Milwaukee, WI
Industry: Computer Software
Work: Rockwell Automation, Senior Software Engineer

Publications

US Patents

Methods, Systems, Apparatuses, And Techniques For Employing Augmented Reality And Virtual Reality

US Patent: 20220253129, Aug 11, 2022
Filed: Apr 27, 2022
Appl. No.: 17/660928
Inventors:
- Mayfield Heights OH, US
John M. Van Hecke - Hartland WI, US
Alex L. Nicoll - Brookfield WI, US
Paul D. Schmirler - Glendale WI, US
Mark Bjerke - Waukesha WI, US
International Classification:
G06F 3/01
G06T 11/60
G09B 9/00
G09B 19/00
G06F 3/03
G06T 11/00
Abstract:
Techniques for employing augmented reality or virtual reality information are presented. An information management component (IMC) of an augmented reality device (ARD) can monitor and detect user activities and conditions in area in proximity to ARD. Based on user activities and conditions, IMC can determine augmented reality information that can enhance user experience, performance of user activities, or security and safety of user. IMC can present, via an interface component of ARD, the augmented reality information to the user. The augmented reality information can relate to user location; navigation by the user; tasks to be performed by the user; product assembly; maintenance work; system or product design or configuration; remote control of assembly, maintenance, design, or configuration; environmental and/or hazardous conditions; security, identification, and authentication of users; or training the user to perform tasks. IMC can translate information from a language to a different language of the user.
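
As a rough, illustrative sketch of the flow this abstract describes (not the patented implementation; every class, function, and string here is a hypothetical stand-in), a minimal Python version of the monitor-decide-present loop might look like this:

```python
# Minimal sketch of an information-management flow: observe user activity and
# nearby conditions, choose relevant augmented reality content, and hand it to
# a display interface. All names are hypothetical.
from dataclasses import dataclass

@dataclass
class Observation:
    activity: str          # e.g. "assembling", "navigating"
    hazard_nearby: bool    # e.g. a detected hazardous condition
    location: str

def select_ar_content(obs: Observation) -> list[str]:
    """Pick AR overlays that match the observed activity and conditions."""
    content = []
    if obs.hazard_nearby:
        content.append(f"WARNING: hazardous condition near {obs.location}")
    if obs.activity == "assembling":
        content.append("Show next assembly step overlay")
    elif obs.activity == "navigating":
        content.append(f"Show route guidance from {obs.location}")
    return content

def present(content: list[str]) -> None:
    # Stand-in for the AR device's interface component.
    for item in content:
        print("[AR display]", item)

present(select_ar_content(Observation("assembling", True, "cell 4")))
```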

Systems And Methods For Providing Context-Based Data For An Industrial Automation System

US Patent: 20230089251, Mar 23, 2023
Filed: Sep 22, 2021
Appl. No.: 17/482153
Inventors:
- Mayfield Heights OH, US
Paul D. Schmirler - Glendale WI, US
Timothy T. Duffy - Franklin WI, US
Kristopher J. Holley - Mequon WI, US
Susan J. Lovas - Sussex WI, US
International Classification:
G06K 9/00
G05B 19/048
G06T 19/00
Abstract:
A non-transitory computer-readable medium includes instructions that, when executed by processing circuitry, are configured to cause the processing circuitry to receive, from first sensors, first sensory datasets associated with an industrial automation system, receive, from second sensors, second sensory datasets associated with a machine configured to perform mechanical operations, determine a position of the machine relative to the industrial automation system based on the first sensory datasets and the second sensory datasets, determine output representative data associated with the industrial automation system based on the first sensory datasets and the second sensory datasets and in accordance with the position of the machine relative to the industrial automation system, instruct an extended reality device to present the output representative data, determine movement of components of the machine, and instruct the extended reality device to present feedback based on the movement of the components.
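
For illustration only, a minimal Python sketch of the relative-positioning idea in this abstract, assuming simple averaged (x, y) position fixes from two hypothetical sensor groups rather than the patent's actual method:

```python
# Estimate the machine's position relative to the automation system by
# averaging position samples from two independent sensor groups, then decide
# what an extended reality device should show. Names are invented.
import statistics

def fuse_position(samples: list[tuple[float, float]]) -> tuple[float, float]:
    """Average a set of (x, y) position samples from one sensor group."""
    xs, ys = zip(*samples)
    return statistics.mean(xs), statistics.mean(ys)

def relative_position(system_samples, machine_samples):
    sx, sy = fuse_position(system_samples)
    mx, my = fuse_position(machine_samples)
    return mx - sx, my - sy   # machine position in the system's frame

dx, dy = relative_position([(0.0, 0.0), (0.2, -0.1)], [(3.1, 4.0), (2.9, 4.2)])
print(f"[XR device] Machine offset from system: {dx:.1f} m, {dy:.1f} m")
```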

Augmented Reality Interaction Techniques

US Patent: 20230091359, Mar 23, 2023
Filed: Nov 21, 2022
Appl. No.: 17/991586
Inventors:
- Milwaukee OH, US
Paul D. Schmirler - Glendale WI, US
Timothy T. Duffy - Franklin WI, US
International Classification:
G06F 3/01
G06F 3/04815
G06F 3/04842
G06T 13/20
G06T 19/00
G06F 1/16
G06F 3/04845
G06F 3/0482
Abstract:
A method may include receiving, via a processor, image data associated with a user's surrounding and generating, via the processor, a visualization that may include a virtual industrial automation device. The virtual industrial automation device may depict a virtual object within image data, and the virtual object may correspond to a physical industrial automation device. The method may include displaying, via the processor, the visualization via an electronic display and detecting, via the processor, a gesture in image data that may include the user's surrounding and the visualization. The gesture may be indicative of a request to move the virtual industrial automation device. The method may include tracking, via the processor, a user's movement, generating, via the processor, a visualization that may include an animation of the virtual industrial automation device moving based on the user's movement, and displaying, via the processor, the visualization via the electronic display.
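
A small Python sketch of the gesture-driven interaction the abstract outlines; the gesture check, frame format, and rendering call are invented stand-ins, not the patented method:

```python
# Detect a "move" gesture in incoming frames, track the hand positions while
# the gesture is held, and animate a virtual device along the tracked path.
def detect_move_gesture(frame: dict) -> bool:
    # Stand-in for a gesture classifier; a real system would run a model here.
    return frame.get("gesture") == "pinch_and_drag"

def animate_virtual_device(path: list[tuple[float, float]]) -> None:
    for x, y in path:
        print(f"render virtual device at ({x:.2f}, {y:.2f})")

frames = [
    {"gesture": "pinch_and_drag", "hand": (0.10, 0.20)},
    {"gesture": "pinch_and_drag", "hand": (0.15, 0.25)},
    {"gesture": "open_palm", "hand": (0.15, 0.25)},   # gesture ends
]
path = [f["hand"] for f in frames if detect_move_gesture(f)]
animate_virtual_device(path)
```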

Systems And Methods For Modifying Context-Based Data Provided For An Industrial Automation System

US Patent: 20230091778, Mar 23, 2023
Filed: Sep 22, 2021
Appl. No.: 17/482145
Inventors:
- Mayfield Heights OH, US
Paul D. Schmirler - Glendale WI, US
Timothy T. Duffy - Franklin WI, US
Kristopher J. Holley - Mequon WI, US
Susan J. Lovas - Sussex WI, US
International Classification:
G06T 11/00
G06N 5/04
G06N 5/02
G06F 3/01
Abstract:
A tangible, non-transitory, computer-readable medium includes instructions that, when executed by processing circuitry, are configured to cause the processing circuitry to receive sensory datasets associated with an industrial automation system, determine context information based on a sensory dataset and representative of an environmental condition, predict an intent of a user to complete a task associated with the industrial automation system based on the sensory datasets and the context information, present first output representative data via an extended reality device based on the intent and a setting, the setting including a data presentation format for presenting the sensory datasets, receive inputs indicative of changes to the data presentation format, present second output representative data via the extended reality device in response to receiving the inputs, and update the setting based on the inputs and historical data indicative of users changing the data presentation format of the first output representative data.
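
A hedged Python sketch of the adaptive presentation-format idea described above; the class, threshold, and format names are all assumptions made for illustration:

```python
# Present data in a stored default format, and when the user repeatedly
# switches to another format, promote that format to the new default.
from collections import Counter

class PresentationSettings:
    def __init__(self, default_format: str = "table"):
        self.default_format = default_format
        self.history = Counter()

    def present(self, dataset_name: str) -> str:
        return f"show {dataset_name} as {self.default_format}"

    def record_user_change(self, chosen_format: str, threshold: int = 3) -> None:
        # If a format is chosen often enough, make it the new default.
        self.history[chosen_format] += 1
        if self.history[chosen_format] >= threshold:
            self.default_format = chosen_format

settings = PresentationSettings()
print(settings.present("motor temperatures"))
for _ in range(3):
    settings.record_user_change("trend chart")
print(settings.present("motor temperatures"))   # now defaults to "trend chart"
```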

Systems And Methods For Providing Context-Based Data For An Industrial Automation System

US Patent: 20230092405, Mar 23, 2023
Filed: Sep 22, 2021
Appl. No.: 17/482165
Inventors:
- Mayfield Heights OH, US
Paul D. Schmirler - Glendale WI, US
Timothy T. Duffy - Franklin WI, US
Kristopher J. Holley - Mequon WI, US
Susan J. Lovas - Sussex WI, US
International Classification:
G05B 19/418
G06F 16/23
G06F 3/0482
G06T 11/00
G06F 3/01
Abstract:
A tangible, non-transitory, computer-readable medium includes instructions that, when executed by processing circuitry, are configured to cause the processing circuitry to receive user input indicative of a selection of a user experience of a plurality of user experiences. The plurality of user experiences include a first user experience associated with a first event that occurred in an industrial automation system at a first time prior to receiving the user input and a second user experience associated with a second event expected to occur in the industrial automation system at a second time after receiving the user input. When executed, the instructions also cause the processing circuitry to determine, based on the user input, output representative data associated with the industrial automation system and instruct an extended reality device to present the output representative data.
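
An illustrative Python sketch of the past-versus-expected experience selection the abstract describes, with made-up timestamps and function names:

```python
# Choose between replaying logged data for a past event and simulating
# conditions for an event expected in the future.
from datetime import datetime

def build_experience(event_time: datetime, now: datetime) -> str:
    if event_time <= now:
        return f"replay logged data for event at {event_time:%Y-%m-%d %H:%M}"
    return f"simulate expected conditions for event at {event_time:%Y-%m-%d %H:%M}"

now = datetime(2023, 3, 23, 12, 0)
print(build_experience(datetime(2023, 3, 20, 8, 30), now))   # past event
print(build_experience(datetime(2023, 3, 25, 14, 0), now))   # expected event
```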

Systems And Methods For Providing Context-Based Data For An Industrial Automation System

US Patent: 20230092938, Mar 23, 2023
Filed: Sep 22, 2021
Appl. No.: 17/482205
Inventors:
- Mayfield Heights OH, US
Paul D. Schmirler - Glendale WI, US
Timothy T. Duffy - Franklin WI, US
Kristopher J. Holley - Mequon WI, US
Susan J. Lovas - Sussex WI, US
International Classification:
G05B 19/418
Abstract:
A non-transitory computer-readable medium includes instructions that, when executed by processing circuitry, are configured to cause the processing circuitry to receive sensory datasets associated with an industrial automation system from sensors, receive positioning data via an extended reality device associated with a user, determine a first virtual positioning of the user in a virtual coordinate system based on the positioning data, determine a second virtual positioning of an industrial automation system in the virtual coordinate system based on the sensory datasets, determine output representative data to be presented by the extended reality device based on the sensory datasets and in accordance with the first virtual positioning relative to the second virtual positioning, and instruct the extended reality device to present the output representative data.
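
For illustration, a minimal Python sketch that places the user and machines in one virtual coordinate frame and surfaces data by proximity, approximating the relative-positioning idea above; machine names and coordinates are invented:

```python
# Map the user and each machine into a shared coordinate system and present
# data for whichever machine the user is currently closest to.
import math

machines = {
    "conveyor_3": (2.0, 5.0),
    "press_1": (12.0, 1.5),
}

def nearest_machine(user_pos: tuple[float, float]) -> str:
    return min(machines, key=lambda m: math.dist(user_pos, machines[m]))

user_pos = (3.0, 4.0)   # from the extended reality device's positioning data
target = nearest_machine(user_pos)
print(f"[XR device] showing status overlay for {target}")
```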

Systems And Methods For Providing Context-Based Data For An Industrial Automation System

US Patent: 20230093660, Mar 23, 2023
Filed: Sep 22, 2021
Appl. No.: 17/482097
Inventors:
- Mayfield Heights OH, US
Paul D. Schmirler - Glendale WI, US
Timothy T. Duffy - Franklin WI, US
Kristopher J. Holley - Mequon WI, US
Susan J. Lovas - Sussex WI, US
International Classification:
G06T 11/00
G06F 16/28
G06F 3/01
Abstract:
A tangible, non-transitory, computer-readable medium includes instructions. The instructions, when executed by processing circuitry, are configured to cause the processing circuitry to receive a plurality of sensory datasets associated with an industrial automation system from a plurality of sensors, categorize each sensory dataset of the plurality of sensory datasets into one or more sensory dataset categories of a plurality of sensory dataset categories, determine context information associated with the plurality of sensory datasets, the context information being representative of an environmental condition associated with an extended reality device, the industrial automation system, or both, determine a priority of each sensor dataset category of the plurality of sensory dataset categories based on the context information, determine output representative data to be presented by the extended reality device based on the plurality of sensory datasets and the priority, and instruct the extended reality device to present the output representative data.
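
A compact Python sketch of the category-prioritization step described above, using an invented context-to-relevance table rather than anything taken from the patent:

```python
# Sort sensory dataset categories by how relevant they are to the current
# environmental context before presenting them on the extended reality device.
def rank_categories(categories: list[str], context: str) -> list[str]:
    # Hypothetical relevance weights keyed by context.
    relevance = {
        "maintenance": {"vibration": 3, "temperature": 2, "throughput": 1},
        "production":  {"throughput": 3, "temperature": 2, "vibration": 1},
    }
    weights = relevance.get(context, {})
    return sorted(categories, key=lambda c: weights.get(c, 0), reverse=True)

datasets = ["throughput", "vibration", "temperature"]
for category in rank_categories(datasets, context="maintenance"):
    print(f"[XR device] present {category} data")
```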

Digital Twin Outcome-Driven Orchestration

US Patent: 20230082099, Mar 16, 2023
Filed: Sep 10, 2021
Appl. No.: 17/471967
Inventors:
- Mayfield Heights OH, US
Thong Nguyen - Milwaukee WI, US
Paul Schmirler - Glendale WI, US
Jingbo Liu - Mequon WI, US
International Classification:
G05B 19/418
Abstract:
Various embodiments of the present technology relate to digital twins of devices and assemblies. More specifically, some embodiments relate to the orchestration of digital twin models for representing industrial systems based on characteristics of digital twins. In an embodiment, a method of operating an orchestration engine in an industrial automation environment comprises identifying a targeted outcome for modeling the industrial automation environment, configuring a digital twin environment corresponding to the industrial automation environment based at least on the targeted outcome, and executing a process associated with the industrial automation environment using the digital twin environment.
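
A loose Python sketch of the three orchestration steps named in the abstract (identify a targeted outcome, configure a digital twin environment, execute a process); all classes, outcomes, and model names are hypothetical:

```python
# Configure a digital twin environment for a chosen outcome, then run a
# simulated process against it.
from dataclasses import dataclass, field

@dataclass
class DigitalTwinEnvironment:
    outcome: str
    models: list[str] = field(default_factory=list)

def configure_twin(outcome: str) -> DigitalTwinEnvironment:
    # Choose which device/assembly models to load based on the targeted outcome.
    catalog = {
        "maximize_throughput": ["conveyor_model", "packaging_cell_model"],
        "reduce_energy_use": ["drive_model", "hvac_model"],
    }
    return DigitalTwinEnvironment(outcome, catalog.get(outcome, []))

def execute_process(twin: DigitalTwinEnvironment) -> None:
    for model in twin.models:
        print(f"simulating {model} toward outcome '{twin.outcome}'")

execute_process(configure_twin("maximize_throughput"))
```
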
Paul D Schmirler from Glendale, WI, age ~57