T 1986/13 of 20.12.2017

European Case Law Identifier: ECLI:EP:BA:2017:T198613.20171220
Date of decision: 20 December 2017
Case number: T 1986/13
Application number: 08726548.4
IPC class: G05D 1/00
G08G 5/00
Language of proceedings: EN
Distribution: D
Versions: Unpublished
Title of application: AUGMENTED REALITY-BASED SYSTEM AND METHOD PROVIDING STATUS AND CONTROL OF UNMANNED VEHICLES
Applicant name: Exelis Inc.
Opponent name: -
Board: 3.5.03
Headnote: -
Relevant legal provisions:
European Patent Convention Art 56
Keywords: Inventive step - after amendment (yes)
Catchwords: -

Cited decisions: -
Citing decisions: -

Summary of Facts and Submissions

I. This appeal is against the decision of the examining division refusing European patent application No. 08726548.4 with international publication No. WO 2008/112148 A1. The refusal was based on the grounds that the subject-matter of independent claims 1 and 4 lacked novelty (Articles 52(1) and 54 EPC) and that the independent claims did not comply with Article 84 EPC.

II. With the statement of grounds of appeal the appellant filed sets of claims of a main request and an auxiliary request. Oral proceedings were conditionally requested.

III. In a communication accompanying the summons to oral proceedings, the board addressed points to be discussed in the oral proceedings and gave a preliminary opinion regarding inventive step (Article 56 EPC).

IV. With a letter dated 1 December 2017 the appellant filed further sets of claims in the form of a second and a third auxiliary request.

V. Oral proceedings before the board were held on 20 December 2017.

In the course of the oral proceedings the appellant replaced all requests on file by a single set of claims 1 to 4 ("Main Request").

The appellant requested that the decision under appeal be set aside and that a patent be granted on the basis of the claims as filed during the oral proceedings.

At the end of the oral proceedings the chairman announced the board's decision.

VI. The following documents are referred to in this decision:

D1: M. Koeda et al: "Annotation-Based Rescue Assistance System for Teleoperated Unmanned Helicopter with Wearable Augmented Reality Environment", Proceedings of the 2005 IEEE International Workshop on Safety, Security and Rescue Robotics, Kobe, Japan, June 6-9, 2005, pages 120 to 124;

D2: US 2006/0241792 A1; and

D3: WO 99/05580 A2.

VII. Independent claim 1 reads as follows:

"A method of identifying and controlling an unmanned vehicle (100) located within an environment (108), so as to navigate the unmanned vehicle through a three-dimensional urban environment, the method comprising the steps of:

receiving information from one or more sensors (116) coupled to the unmanned vehicle (100), where the information includes sensor location information and status information about the unmanned vehicle (100);

obtaining viewpoint information corresponding to a real-time view of said environment (108) from a perspective of a display (124) remote from the unmanned vehicle (100), the real-time view comprising the unmanned vehicle (100) in the context of the environment (108) and being either a direct view of the unmanned vehicle (100) itself or a video of the unmanned vehicle (100);

generating graphics using said sensor location information and viewpoint information, wherein the graphics include visual representations of controls to control the unmanned vehicle (100) and the status information;

displaying the generated graphics on the display (124) remote from the unmanned vehicle (100) such that the graphics are superimposed on the real-time view, wherein the graphics appear attached to the unmanned vehicle (100) in the real-time view; and

activating one of the displayed controls, wherein the displayed controls include:

arrows (500) that upon being selected move the unmanned vehicle (100) in the direction selected; or

an icon next to the unmanned vehicle (100) that upon being moved to a location moves the unmanned vehicle (100) to a corresponding location; or

an action button (706) next to the unmanned vehicle (100) that upon being activated commands the unmanned vehicle (100) to move to a predetermined location."

Independent claim 3 reads as follows:

"An augmented reality system for identifying and controlling an unmanned vehicle (100) located within an environment (108), so as to navigate the unmanned vehicle through a three-dimensional urban environment, the augmented reality system comprising:

one or more sensors coupled to the unmanned vehicle (100);

a tracking system that obtains viewpoint information corresponding to a real-time view of said environment (108) from a perspective of a display (124) remote from the unmanned vehicle (100), the real-time view comprising the unmanned vehicle (100) in the context of the environment (108) and being either a direct view of the unmanned vehicle (100) itself or a video of the unmanned vehicle (100);

a processing system that receives information from said one or more sensors (116), where the information includes sensor location information and status information about the unmanned vehicle (100), and generates graphics using said sensor location information and said viewpoint information, wherein the graphics include visual representations of controls to control the unmanned vehicle (100) and status information;

the display (124) that displays [sic] the generated graphics remote from the unmanned vehicle (100) such that the graphics are superimposed on the real-time view, wherein the graphics appear attached to the unmanned vehicle (100); and

an interaction device that activates one of the displayed controls; wherein the displayed controls include:

arrows (500) that upon being selected move the unmanned vehicle (100) in the direction selected; or

an icon next to the unmanned vehicle (100) that upon being moved to a location moves the vehicle to a corresponding location; or

an action button (706) next to the unmanned vehicle (100) that upon being activated commands the unmanned vehicle (100) to move to a predetermined location."

Reasons for the Decision

1. Claims - amendments (Article 123(2) EPC)

Claim 1 is based on claims 1 and 4 to 6 as originally filed and passages from paragraphs [0003] (specification of the environment) and [0019], [0026] and [0030] regarding the feature of obtaining viewpoint information and the specification of the real-time view.

Claim 3 is based on claim 15 as filed and the further passages cited above.

The features of claims 2 and 4 are based on claim 2 as originally filed.

The amendments therefore comply with Article 123(2) EPC.

2. Article 84 EPC

As regards the ground for refusal pursuant to Article 84 EPC, it was argued in the impugned decision that the claims were missing "essential features" (cf. points 2.1 and 2.2 of the reasons), in that claim 1 did not specify that the controlled apparatus is a remotely controlled unmanned vehicle and that the unmanned vehicle is remotely controlled by a user. This objection no longer applies to the claims as presently worded (see point VII above).

The board is therefore satisfied that the claims comply with Article 84 EPC.

3. Claim 1 - inventive step (Article 56 EPC)

3.1 D1 relates to a method of identifying and controlling a teleoperated unmanned helicopter (cf. Fig. 4) to fly through a three-dimensional environment ("HEIJY Palace Site", cf. Fig. 7) and is therefore in the same technical field as the application. D1 discloses that various sensors ("GPS", "Gyroscope", see the system diagram in Fig. 3) are mounted on the helicopter and that information from one or more sensors, including sensor location information (i.e. from the GPS sensor) and status information (e.g. the orientation of the helicopter indicated by the gyroscope), is received at an operator station. It is further disclosed that a map of the environment through which the helicopter is navigated is displayed to the user (cf. the map shown in the lower-left portion of each image in Figs. 11(a) to 11(f)). This implies that, using the wording of the application, viewpoint information corresponding to a view of the environment is received. The helicopter is indicated on the map by an arrow inserted at the corresponding location on the map. The arrow is oriented according to the helicopter's present real orientation. D1 therefore discloses that graphics are generated using the location and orientation of the helicopter on the map as status information. These graphics are displayed on a display remote from the helicopter and are superimposed on the real-time view of an image taken by a camera mounted on the helicopter (i.e. the views depicted in Figs. 11(a) to 11(f)).

3.2 The method according to claim 1 differs from that disclosed in D1 by the following features:

the viewpoint information obtained corresponds to a real-time view of the environment from a perspective of a display remote from the unmanned vehicle, the real-time view comprising the unmanned vehicle in the context of the environment and being either a direct view of the unmanned vehicle itself or a video of the unmanned vehicle,

the graphics also include visual representations of controls to control the unmanned vehicle,

the graphics appear attached to the unmanned vehicle in the real-time view,

one of the displayed controls is activated; and

the displayed controls include:

arrows that upon being selected move the unmanned vehicle in the direction selected or an icon next to the unmanned vehicle that upon being moved to a location moves the unmanned vehicle to a corresponding location or an action button next to the unmanned vehicle that upon being activated commands the unmanned vehicle to move to a predetermined location.

3.3 In D1, the environment and the helicopter do not need to be directly visible to the controlling person, who exercises control on the basis of a view as if he/she were located inside the helicopter.

Therefore, the objective technical problem, starting out from D1, can be formulated as providing an alternative method for controlling the vehicle when it is visible from the viewpoint of the controlling person.

3.4 D1 itself does not provide any hint to the solution as claimed. D1 does not suggest any possibility for controlling the vehicle other than providing the controlling person with a view of the environment from the position of the vehicle and a map as an aid for orientation. Therefore, the claimed method is not rendered obvious to the skilled person having regard to D1 alone.

3.5 D2 discloses a method in which an image of a device is captured and displayed to a person (see Fig. 1). D2 is in particular concerned with controlling stationary installation equipment, such as a manufacturing robot installed at a fixed place (cf. paragraph [0004]). The device is recognized on the basis of an ID which is conceptually attached to it. An augmented reality view of the device is presented to the person by superimposing supplemental information about the device on the captured image as a computer-generated image (cf. paragraphs [0042] to [0046]).

Starting out from D1, there is no hint for the skilled person to consider a combination with the control method described in D2, since the former is for controlling a vehicle which changes its position and may be out of sight of the controlling person, whereas the latter is specifically for controlling stationary equipment in sight of the controlling person. The board is therefore of the view that the skilled person, starting out from D1 and faced with the problem as formulated above, would not consider D2 without the benefit of hindsight. Therefore, the skilled person would not arrive at the method as claimed having regard to D1 and D2 in combination.

3.6 Further, D3 discloses a method of operating an unmanned transport vehicle ("F", see the abstract) in which visual representations of controls for controlling the vehicle are embedded as graphics in a view of the environment as seen from the vehicle itself (cf. Fig. 7 and the first paragraph on page 35 of the description). Therefore, combining D1 with D3 does not result in the step of obtaining viewpoint information from a perspective of a display remote from the vehicle and comprising the vehicle in the context of the environment.

Therefore, the skilled person would not arrive at the method as claimed having regard to D1 and D3 in combination.

3.7 Finally, starting out from D2, the skilled person would not, without the benefit of hindsight, consider applying the method of D2 to non-stationary equipment, for the same reasons as set out at point 3.5 above.

Therefore, the skilled person would not arrive at the method as claimed having regard to D2.

3.8 Since the method is not rendered obvious by the available prior art represented by D1, D2 and D3, taken alone or in combination, it follows that the method of claim 1 meets the requirement of Article 56 EPC.

4. The above considerations apply, mutatis mutandis, to the augmented reality system of claim 3.

5. Claims 2 and 4 further limit the subject-matter of claims 1 and 3, respectively. Therefore, the subject-matter of claims 2 and 4 meets the requirement of Article 56 EPC for the same reasons as set out above.

Order

For these reasons it is decided that:

1. The decision under appeal is set aside.

2. The case is remitted to the examining division with the order to grant a patent on the basis of claims 1 to 4 of the Main Request as submitted during the oral proceedings and a description to be adapted.
