Thursday, 30 November 2017, 18:32


During the open day, the artificial intelligence research group usually demonstrates the planner it has developed. Typically the planner solves some PDDL problems, such as Rovers or Blocks World.

It would be awesome to have a GUI (made in Unity or in Unreal Engine) capable of showing the plan in action.

PDDL (Planning Domain Definition Language) is a language for expressing AI planning problems. A simple example of such a problem might be:

Blocks A and B are on the table; block C is on block B. You have a robotic arm that can move the blocks. The goal is for block B to be on block A, block C to be on block B, and block A to remain on the table.
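One common way to think about such a problem is as a set of facts (predicates) describing the state, plus a set of facts that must hold in the goal. A minimal sketch in Python, where the predicate names `on` and `ontable` are illustrative (loosely following the classic Blocks World conventions, not any specific planner's syntax):

```python
# Illustrative encoding of the Blocks World instance above.
# ("on", X, Y) means block X is on block Y;
# ("ontable", X) means block X rests on the table.

initial_state = {
    ("ontable", "A"),
    ("ontable", "B"),
    ("on", "C", "B"),
}

goal = {
    ("ontable", "A"),
    ("on", "B", "A"),
    ("on", "C", "B"),
}

def goal_satisfied(state, goal):
    # The goal holds when every goal fact is present in the state.
    return goal <= state

print(goal_satisfied(initial_state, goal))
```

In the initial state block B is still on the table rather than on A, so the goal is not yet satisfied and the planner has work to do.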

Planners are programs that solve such problems by returning a list of actions an agent has to perform in order to reach the goal. For example, the robotic arm might do:

put C on the table;
put B on A;
put C on B;

It would be awesome to have, in Unity or in Unreal Engine, an environment where a model of a robotic arm graphically moves the blocks as specified by the plan. To be clear: you do not need to touch the planner itself; you only have to interpret its output. In artificial intelligence there are several "domains": the robotic arm that arranges the blocks; a rover on Mars that performs samplings while recharging its batteries; taxis, and so on. You can choose whatever "domain" you like for this project; you do not have to build a robotic arm.