NASA's Mars rovers use a six-wheeled 'rocker-bogie' suspension arrangement that allows them to climb over boulder-strewn surfaces. This project will investigate this suspension design and propose modifications (or a completely new design) to allow navigation over typical Earthly surfaces, such as kerbs and stairs, or rock outcrops like you would find at the seaside. Your approach to design will be informed by your literature study; there's lots of design theory out there. You may choose to create a simulation (e.g., using Unreal-4) or you may wish to skip this step. Finally, you will create a suspension system using a mix of laser-cut and 3D-printed components, and test it on a range of terrains, obstacle sizes and separations. Your conclusions could provide some design hints for future suspension research.
Vision is an incredibly powerful way for robots to navigate, detect and deal with objects, read instructions (e.g., bar-codes), and generally see. If you don't believe me, close your eyes (I am not allowed to say 'and walk around' since this may put you in harm's way). Until quite recently, robot vision has required powerful microprocessors, well beyond the processing power of our beloved Arduino technology. But this has changed with the introduction of dedicated camera-processor boards which communicate with an Arduino over the high-speed SPI interface, such as the "Pixy 2". Even more exciting is the introduction of professional Arduino boards such as the 'Portenta H7', a dual-core Arduino which still uses the Arduino API. There is also a Vision Shield (with built-in camera) which is capable of Machine Learning using OpenMV and MicroPython. Another board is the AURIX ShieldBuddy TC275, a tri-core board (with the Mega2560 pinout); again, this can be programmed using the standard Arduino API. These boards operate at serious speeds, 200 MHz for the AURIX and 480 MHz for the Portenta; compare this with the typical 16 MHz of the Mega2560, a real leap!
In this project you will study these advanced microcontroller boards (and others) and you will also research applications of machine vision to robots. Then you will choose a board and a problem to solve, and design, build and test your solution.
Primary research will involve collecting data from your system and evaluating its performance against your specification. Alternatively, your primary research could be based on your design-build-test process, where you record and evaluate the success of each stage. Please note I have little experience of using these boards, yet.
There are many ways to create a biped robot, and each way has many challenges. Thinking about the design, just how many joints do we need (hip, thigh, knee, foot)? Indeed, what is the smallest number of joints needed to create a walking biped? These questions will be explored in your literature review. You must also consider the required robot behaviour: is walking along a straight line sufficient, or what about going around curves? And is the behaviour restricted to walking? What about hopping, skipping or running? Indeed, what are the biped gaits observed in the natural world, and which of these can be transferred to a robot? And which of these are useful for a robot: walking up and down stairs, or walking over an uneven surface? Again, all stuff for your literature review.
This project will use an Arduino (or other) microcontroller, and you will need to fabricate the robot components, which will need 3D printing (available). The controller code could be straightforward procedural (where you instruct each joint to move) or use Central Pattern Generators (CPGs), small neural networks inspired by living bipeds.
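To give a flavour of the CPG approach, here is a minimal sketch (in Python for illustration; on the robot this would run on the microcontroller) of two coupled phase oscillators that self-organize into antiphase, one per hip. All parameters here are illustrative assumptions, not tuned values for any particular robot.

```python
import math

# Minimal CPG sketch: two Kuramoto-style phase oscillators coupled so
# that they settle into antiphase, one oscillator per hip joint.
# omega = intrinsic frequency (rad/s), k = coupling strength (assumed).

def step_cpg(phases, dt=0.01, omega=2 * math.pi * 1.0, k=2.0):
    """Advance both oscillator phases one Euler step. The coupling term
    pulls the pair towards a phase difference of pi (antiphase)."""
    p1, p2 = phases
    dp1 = omega + k * math.sin(p2 - p1 - math.pi)
    dp2 = omega + k * math.sin(p1 - p2 - math.pi)
    return (p1 + dp1 * dt, p2 + dp2 * dt)

phases = (0.0, 0.5)           # start the legs slightly out of sync
for _ in range(2000):         # 20 s of simulated time
    phases = step_cpg(phases)

# phase difference should have converged close to pi (antiphase)
diff = (phases[1] - phases[0]) % (2 * math.pi)

# map each phase to a hip joint angle (radians); amplitude is illustrative
hip_left = 0.4 * math.sin(phases[0])
hip_right = 0.4 * math.sin(phases[1])
```

The point of the sketch is that the alternating-leg rhythm emerges from the coupling rather than being scripted joint by joint; adding more oscillators (knees, ankles) with chosen phase offsets extends the same idea.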
Primary research will involve collecting data from the robot and analysing this according to a planned investigation, e.g., how do various parameters affect the speed of movement, and which parameters can generate walking, running and other gaits?
There is currently a renewed focus on travel in space, evidenced by very recent Mars landings, and flights to the space station. This project will use the UE4 engine to simulate the flight of a spacecraft in a restricted part of the Solar System, e.g., Earth-Mars or Earth-Moon. It will extend a previous Computing Project where the Solar System was simulated without a spacecraft.
The project could take many forms; you could simulate launch from the Earth into orbit around the Moon or Mars, or you could focus on the descent and landing phase of a vehicle. This could be done using an autonomous controller, though you may like to give it a more game feel with a human in the driving seat. The maths behind the model is known as the "restricted three-body" problem.
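To give a feel for the maths, here is a hedged sketch of the planar circular restricted three-body problem in the rotating frame, in the usual nondimensional units. The mass ratio below is approximately the Earth-Moon value and the starting point (near the stable L4 point) is purely illustrative.

```python
import math

# Planar circular restricted three-body problem (CR3BP), rotating frame,
# nondimensional units: the two large bodies sit at (-mu, 0) and
# (1 - mu, 0). MU is roughly the Earth-Moon mass ratio (illustrative).
MU = 0.01215

def accel(x, y, vx, vy, mu=MU):
    """Acceleration on the (massless) spacecraft in the rotating frame,
    including the centrifugal and Coriolis terms."""
    r1 = math.hypot(x + mu, y)          # distance to the larger body
    r2 = math.hypot(x - 1 + mu, y)      # distance to the smaller body
    ax = x + 2 * vy - (1 - mu) * (x + mu) / r1**3 - mu * (x - 1 + mu) / r2**3
    ay = y - 2 * vx - (1 - mu) * y / r1**3 - mu * y / r2**3
    return ax, ay

def integrate(state, dt=1e-3, steps=10000):
    """Semi-implicit Euler: crude but adequate for a sketch; a real
    project would use RK4 or better."""
    x, y, vx, vy = state
    for _ in range(steps):
        ax, ay = accel(x, y, vx, vy)
        vx += ax * dt
        vy += ay * dt
        x += vx * dt
        y += vy * dt
    return x, y, vx, vy

# A spacecraft parked at the L4 equilibrium point should barely move,
# since L4 is stable for this mass ratio.
x0, y0 = 0.5 - MU, math.sqrt(3) / 2
final = integrate((x0, y0, 0.0, 0.0))
```

The same `accel` function drives every scenario in the project brief; only the initial conditions (launch, transfer, descent) change.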
Primary data will be the trajectories of the spacecraft's motion and the planets' positions. These could be validated against real-world NASA data.
As we move towards a carbon-neutral energy balance, we are looking for alternative ways to harvest energy. Vibration Energy Harvesting (VEH) is an emerging area of research where periodic motion is converted into electrical power. Think about where vibration energy is wasted: one example is the shock absorbers in a car's suspension, where motion is converted into heat. Why not re-design the shock absorbers to produce electricity? Think about a train: when it travels over a section of track, the track bends down for a while. Why not convert this bending into electricity? Or put a device into your shoe, so that when you stomp around, each stomp generates electrical power.
There are two main forms of VEH: magnetic and piezoelectric. This project will explore one or both of these technologies. You will conduct a simulation and, if you want, attempt the construction of a physical device. Primary data will be collected from the simulation or device, measuring the amount of electrical power generated for various amounts of vibration.
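As a starting point for the simulation, a harvester (of either kind) is commonly sketched as a base-excited mass-spring-damper in which the electrical load appears as an extra damper; the power dissipated in that damper is the harvested power. All component values below are illustrative assumptions, not measurements of a real device.

```python
import math

# Base-excited mass-spring-damper model of a vibration energy harvester.
# x = displacement of the proof mass relative to the vibrating base;
# power captured by the electrical damping c_e is c_e * xdot**2.
m, k = 0.01, 100.0               # 10 g proof mass, spring stiffness (N/m)
c_m, c_e = 0.02, 0.02            # mechanical loss / electrical damping (Ns/m)
Y, w = 0.001, math.sqrt(k / m)   # 1 mm base motion, driven at resonance

def average_power(steps=200000, dt=1e-5):
    """Euler-integrate the harvester for 2 s and return mean power (W)."""
    x = xdot = 0.0
    energy, t = 0.0, 0.0
    for _ in range(steps):
        base_acc = -Y * w * w * math.sin(w * t)            # base acceleration
        xddot = (-(c_m + c_e) * xdot - k * x - m * base_acc) / m
        xdot += xddot * dt
        x += xdot * dt
        energy += c_e * xdot * xdot * dt                   # harvested energy (J)
        t += dt
    return energy / t

p = average_power()   # tens of milliwatts for these assumed values
```

Sweeping the drive frequency `w` and amplitude `Y` in this model gives exactly the "power versus amount of vibration" data the project asks for.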
When you play a game, how do you decide where to go next? Which door to go through, or which staircase to go up? Which path to take out in the open? Here you will study theories of perception and cognition and apply these to designing a game level, or modding one with additional assets which should influence the players' decisions. The preferred engine is Unreal-4, but this could be discussed.
Primary research would involve getting a load of folk to play your level(s). You would make a video log of their behaviour and subject this to analysis.
Here you will design, build and test a two-wheeled educational robot from raw components, to produce a cheap but sophisticated robot that can solve problems such as maze following. You will select an appropriate microcontroller (e.g., an Arduino flavour), the drive system (stepper, servo, or DC motors with encoders) and a range of sensors. You will design the chassis and other mechanical components in CAD, and these will be laser-cut or 3D-printed for you. Then you will code a solution using a suitable IDE (matched to your microcontroller choice).
Primary data will be collected from the robot. Here you will compare its actual performance with the desired performance. A number of tests could be made: accuracy of moving on an arc of set radius, and accuracy of navigation between obstacles.
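The arc-accuracy test rests on standard differential-drive kinematics: set the two wheel speeds from the desired arc radius, then dead-reckon the pose to compare against what the robot actually does. A sketch, with an assumed wheelbase (yours will differ):

```python
import math

# Differential-drive kinematics sketch: wheel speeds for an arc of set
# radius, plus a simple dead-reckoned pose update.
WHEELBASE = 0.12   # metres between wheel centres (assumed value)

def wheel_speeds(v, radius):
    """Linear wheel speeds (m/s) for forward speed v (m/s) on an arc of
    the given radius (m); positive radius turns left."""
    omega = v / radius
    return omega * (radius - WHEELBASE / 2), omega * (radius + WHEELBASE / 2)

def dead_reckon(x, y, heading, v_left, v_right, dt):
    """Update the robot pose from wheel speeds over a short time step."""
    v = (v_left + v_right) / 2
    omega = (v_right - v_left) / WHEELBASE
    heading += omega * dt
    x += v * math.cos(heading) * dt
    y += v * math.sin(heading) * dt
    return x, y, heading

# Drive a full circle of radius 0.5 m at 0.2 m/s in small steps;
# ideally the robot returns to its start point.
vl, vr = wheel_speeds(0.2, 0.5)
x = y = th = 0.0
steps = 10000
dt = (2 * math.pi * 0.5 / 0.2) / steps   # time for one circumference
for _ in range(steps):
    x, y, th = dead_reckon(x, y, th, vl, vr, dt)
```

On the real robot the same `dead_reckon` update would run from encoder counts, and the gap between the predicted and measured finishing position is exactly the accuracy figure the test needs.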
In this project you will code a simulation of a wind turbine which adapts to the current wind speed in order to achieve optimal performance. This is achieved by varying the generator load on the turbine, and adjusting the pitch of the blades for higher speeds. You will model the CART-3 research turbine from the US National Renewable Energy Laboratory, details and tons of data are available for this turbine. You will verify your model against published research.
Primary data will be collected from the simulation which will be compared with published data. The second phase of the project will look at investigating the layout of wind farms. Here you will change the spacing of the turbines and their geometrical layout, and measure the farm efficiency as these are varied.
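At the core of any such model is the standard power relation P = ½ρACpv³ and its two control regions: below rated wind speed the controller tracks the best power coefficient, and above it the blades are pitched to cap output at rated power. The sketch below uses illustrative numbers, not the published CART-3 parameters.

```python
import math

# Idealized turbine power curve: P = 0.5 * rho * A * Cp * v^3, capped at
# rated power by pitch control. All values below are assumptions for
# illustration, not CART-3 data.
RHO = 1.225          # air density, kg/m^3
RADIUS = 20.0        # rotor radius, m (assumed)
CP_MAX = 0.45        # best achievable power coefficient (assumed)
P_RATED = 600e3      # rated electrical power, W (assumed)

def turbine_power(wind_speed):
    """Electrical power (W) at a given wind speed (m/s)."""
    area = math.pi * RADIUS**2
    p_available = 0.5 * RHO * area * CP_MAX * wind_speed**3
    # Region 3: pitching the blades sheds everything above rated power
    return min(p_available, P_RATED)

# Power curve over typical operating wind speeds
curve = {v: turbine_power(v) for v in range(4, 16)}
```

Replacing `CP_MAX` with a Cp(tip-speed ratio, pitch) lookup is the natural next step, and summing (wake-reduced) wind speeds over many turbines extends the same function to the wind-farm phase.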
Take a load of servo-motors and 3D-print some enclosing brackets, connect it all up and you have a snake robot. Of course you'll need to add an Arduino and write some code. This is where things get interesting; you can write a procedural controller where you tell each segment how to move, or you could code a Central Pattern Generator where the segments self-organize their movement.
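For the procedural route, a classic starting point is Hirose's serpenoid gait: each servo follows the same sine wave with a fixed phase lag between segments, so a wave travels down the body. The sketch below is illustrative; the segment count, amplitude and phase lag are assumed values you would tune on the real snake.

```python
import math

# Procedural serpenoid gait sketch: a travelling sine wave across the
# servo chain. A constant steer offset bends the whole body to turn.
N_SEGMENTS = 8   # number of servo joints (assumed)

def joint_angles(t, amp=30.0, freq=1.0, beta=math.pi / 3, steer=0.0):
    """Servo angles in degrees for every segment at time t (seconds).
    beta is the phase lag between neighbouring segments; it sets how
    many wave crests fit along the body."""
    return [amp * math.sin(2 * math.pi * freq * t + i * beta) + steer
            for i in range(N_SEGMENTS)]

# e.g. sample at 50 Hz and write each angle to its servo in turn
frame = joint_angles(0.02)
```

Varying `amp`, `freq` and `beta` and measuring the resulting speed is precisely the kind of parameter study the primary research below calls for; a CPG version would replace the shared clock `t` with coupled per-segment oscillators.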
This is a hot topic in robot research. People are looking at different modes of locomotion (slithering, side-winding etc.), other folk are looking at applications such as inspecting pipes. So there's tons of literature for you.
Primary data would be collected from the snake-bot, e.g., what parameters influence its speed, mode of locomotion and other things you will imagine. Check out Will Donaldson's work https://www.instructables.com/Bioinspired-Robotic-Snake/
Over the past years we have created a huge number of UDK levels for use in teaching and research at Worcester. Examples include rehabilitation of stroke victims, footfall planning for architects, virtual labs for learning physics and also mechanical engineering.
We are now in the process of 'porting' our extensive UDK assets and code into the Unreal-4 engine. Some baby-steps have been made; we have a decent draft coding framework but there is much work to be done. This project will focus on transferring materials, static meshes and terrain from UDK into Unreal-4. In addition you will work out how to create HUDs, on-screen graphs and plots, and work on the user interface. All of this must be linked into the code, which is written in C++.
The project could be situated in the context of (i) the creation of a game, or (ii) the creation of a simulation. You may know that the University is working towards the establishment of a Faculty of Medicine; simulations linked to medical training would be potentially interesting.
Primary data could be collected from your development process; you could keep a learning journal of problems encountered and solved (including details of fixes).