Funding

Self-funded PhD students only

Project code

CCTS4520219

Department

School of Computing

Start dates

February and October

Closing date

Applications accepted all year round

Applications are invited for a self-funded, three-year full-time or six-year part-time PhD project, to commence in October 2019 or February 2020.

The PhD will be based in the School of Computing and will be supervised by Dr Jiacheng Tan and Dr Zhaojie Ju.

Despite advances in robotics and AI, most commercially available robots are meticulously pre-programmed and trained for predefined tasks. Such robots can outperform humans in environments where the tasks and workspaces can be fully controlled and modelled, as is the case in industrial applications. In less controlled and hazardous environments, expert operators have to be involved in the control loop to compensate for robots' inadequate abilities to perceive, understand and assess the surrounding environment and to plan operational tasks.

Without a breakthrough in AI, this reliance on human operators is unlikely to change in the near future, which severely limits the use of autonomous robots in hazardous and domestic settings such as underwater rescue, firefighting and healthcare, even where the hardware platforms are adequate.

In this project, we aim to develop a novel approach that enables robots to learn and understand their surrounding environments and to be tutored by non-expert users, through natural language, to perform a diverse range of tasks.

By relaxing the stringent requirements on the robot perception system and reducing the need for specialised robot-training knowledge and skills, the work is expected to make it possible to deploy dexterous robots more accessibly and at lower cost in domestic and hazardous applications where both the tasks and the environments vary and cannot be fully pre-defined.

The research will build upon and extend our work on robot task and workspace analysis through grounding spatial relationships and quantifiers in natural language. The work will involve natural language analysis, machine learning, knowledge modelling, intelligent robot control and task scheduling. Candidates with an interest in AI and robotics and a good undergraduate or master's degree in computer science, engineering, or other relevant subject areas are invited to apply.

Funding

PhD full-time and part-time courses are eligible for the Government Doctoral Loan (UK and EU students only).

2019/2020 entry

Home/EU/CI full-time students: £4,327 p/a*
Home/EU/CI part-time students: £2,164 p/a*

International full-time students: £15,900 p/a*
International part-time students: £7,950 p/a*

*Fees are subject to annual increase

PhD by Publication fees 2019/2020

Members of staff: £1,610 p/a*
External candidates: £4,327 p/a*

*Fees are subject to annual increase

Entry requirements

You'll need a good first degree from an internationally recognised university (minimum upper second class or equivalent, depending on your chosen course) or a Master's degree in a relevant subject area. In exceptional cases, we may consider equivalent professional experience and/or qualifications. English language proficiency is required at a minimum of IELTS band 6.5 with no component score below

We welcome candidates with an interest in AI and robotics along with a good undergraduate or master's degree in computer science, engineering, or other relevant subject areas.

How to apply

We’d encourage you to contact Dr Jiacheng Tan to discuss your interest before you apply, quoting the project code CCTS4520219.

When you are ready to apply, you can use our online application form and select ‘Computing and Creative Technologies’ as the subject area. Make sure you submit a personal statement, proof of your degrees and grades, details of two referees, proof of your English language proficiency and an up-to-date CV.  

Our How to Apply page offers further guidance on the PhD application process.
