With only three weeks left to go, we’re into the final push before #PiWars.
Today has seen the boys running tests for driving over different terrain, which has included ramps, small steps and marbles. So far we’re doing much better with this year’s four-wheel design than with last year’s two wheels plus a caster. We’re far from stable, but we can at least tackle more things without getting stuck.
The coding side has also seen a focus on using the Pixy for colour recognition, plus employing a distance sensor to stop us driving into things. There was some stress (on my part) last week over the colour matching, when the code just stopped working as we got near the corners (when initially it had been happy). This appears to have been related to the auto white balance settings on the Pixy, as after turning those off performance improved again. I guess I must have fiddled with or reset the settings at some point!
So now the only challenge we have no plan or code for is the maze. The hope is that a combination of spotting the Aliens and using the distance sensor can work here. But at the moment, the first time that code gets a proper run (or any run) is likely to be on the day!!
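Since that maze code doesn’t exist yet, here’s a rough sketch of one possible shape for that “Aliens plus distance sensor” combination. All the names here are hypothetical, not from our real code:

```python
def maze_step(front_cm, alien_seen, turn_direction, stop_cm=20):
    """One step of hypothetical maze logic: the distance sensor tells us
    when a wall is close, and the Alien markers tell us which way to
    turn when it is. Returns 'left', 'right' or 'forward'."""
    if front_cm <= stop_cm:
        # Wall ahead: turn the way the last Alien marker indicated,
        # defaulting to 'right' if we haven't spotted one yet.
        return turn_direction if alien_seen else "right"
    return "forward"
```

The real version would sit in a loop, feeding the motor commands from whatever this returns each pass.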
With the help of mum the boys have also spent some time on starting to make the outer shell look a bit more in line with the theme and “spacey”. Hopefully there’ll be time to make a few more additions there before the day.
We’re all looking forward to seeing all the robots on the Sunday and catching up with other competitors we’ve met in real life or been following on Twitter. Till then… back to that last push!
We had the great opportunity to get down to Cambridge Makespace today for the Robot Club that Brian Corteil has been running so that we could try out our robot on some of the autonomous courses and catch up with the other contenders.
As the videos show there’s still room for improvement, especially on the Nebula challenge, but it was quite satisfying to see that we weren’t far off.
It’s been great to attend these sessions, learning from and feeling part of the wider Raspberry Pi community, especially as there are lots of very good robots in the making for this year’s competition. Thanks for all the hints, tips and loans of the odd screwdriver that people freely shared with us today on how we could improve further in the remaining days until the competition.
Unfortunately we didn’t also get a chance to pop into town for a visit to the new Raspberry Pi store; that’s now planned in the diary for tomorrow. Must resist ideas for new projects until after #PiWars though.
I have now started to code up some of the logic for the automated challenges. The first one I’m aiming for is Blast Off, the straight(ish) line challenge, for which I’ve mocked up a (colour-reversed) course using some lining paper and black electrical tape.
Last year I attempted this with multiple HC-SR04 sensors, but we had a bit of a wobble on the first run, crashed, and broke the connections enough that the code itself crashed too. So I must aim to write fail-safe code this year!
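A minimal sketch of that sort of fail-safe, assuming a `read_fn` that wraps whatever actually talks to the HC-SR04 (the function names here are mine, for illustration):

```python
def safe_distance(read_fn, fallback_cm=5.0):
    """Read a distance sensor, but never let a wiring fault crash the run.

    On any error we return a short fallback distance, so the drive code
    treats it as "obstacle ahead" and stops, rather than the whole
    program dying mid-challenge like last year."""
    try:
        value = read_fn()
    except Exception:
        return fallback_cm
    # A broken echo line often shows up as zero or a nonsense reading.
    if value is None or value <= 0:
        return fallback_cm
    return value
```

The key idea is simply that a crashed sensor should degrade to “stop”, not take the program down with it.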
For 2019 we’re looking to learn how to use a Pixy2 for the challenge. The initial coding attempts were a bit shaky, but by the end of the evening I managed the run in the video above. It was a bit of a fluke, hence my not expecting the robot to turn at the end.
But this is all good encouragement that things are heading in the right direction especially as there’s only a few weeks to go. Hopefully with some more studying of the vector data that’s coming back I can look to guarantee the turns and ramp the speed up a bit.
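As a sketch of what I mean by using the vector data, here’s the sort of steering calculation I have in mind, assuming the Pixy2’s low-resolution line-tracking frame (79 pixels wide) and a vector whose head `(x1, y1)` points along the line. The function name and gain are mine, not anything from the Pixy2 library:

```python
FRAME_WIDTH = 79  # width of the Pixy2 line-tracking frame

def steering_from_vector(x0, y0, x1, y1, gain=1.0):
    """Turn a line vector into a steering value in [-1, 1].

    The head of the vector (x1) tells us where the line is going;
    a negative result means steer left, positive means steer right."""
    error = (x1 - FRAME_WIDTH / 2) / (FRAME_WIDTH / 2)
    return max(-1.0, min(1.0, gain * error))
```

Scaling motor speeds by this value would give proportional steering rather than the bang-bang turns we’ve fluked so far.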
More work was done today on the design and layout of the sensors on our robot. We need to start thinking about a “shell” at some point to try and make things a bit more space-like. But that might have to wait while there’s a big push on the software side, so that we can at least make an attempt at the autonomous challenges.
Seeing some of the other robots being worked on, with their designs and custom boards, is giving me a bit of an imposter feeling, as if we’re out of our depth in this world. But we’ll keep pushing on with our basic knowledge, continue learning as we go, and see where we get to.
An expected donation from Santa has led to an upgrade on the camera front for Diddybot. Now to start working on reading up more on how the Pixy2 works and integrating it into the code.
Things are moving along, although a bit slower than I’d hoped. We’ve now done a bit of further redesign, hiding all the batteries on the lower level and putting the Pi on top. I also discovered I wanted access to more pins than the Picon Zero was allowing, so we’ve added a ModMyPi Triple GPIO Expansion Board. I’m sure there might have been other ways to do this, but I wanted to go simple and quick.
Adding this has allowed us to fit a row of Blinkt! lights (a prize from our first attempt at PiWars last year), which we plan to use as a status display (or just to blind the other competitors). Next to add on are the VL53L1X time-of-flight sensors. These also need a TCA9548A I2C multiplexer so that we can attach several of them even though they all use the same address. At the end of all this I plan to write up a post on the parts – although then I might start to realise how much this fun hobby costs.
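For anyone curious how the multiplexer juggling works: the TCA9548A sits at its own address (0x70 by default) and you write it a single byte, with one bit set per channel you want routed through. A sketch, assuming an smbus2-style `bus` object (the helper names are mine):

```python
TCA9548A_ADDR = 0x70  # default multiplexer address

def channel_mask(channel):
    """Control byte that enables a single multiplexer channel (0-7)."""
    if not 0 <= channel <= 7:
        raise ValueError("TCA9548A has channels 0-7")
    return 1 << channel

def select_channel(bus, channel):
    # Writing the mask routes the I2C bus to that one channel, so
    # several VL53L1X sensors can share the same fixed address and we
    # just pick which one we're talking to before each reading.
    bus.write_byte(TCA9548A_ADDR, channel_mask(channel))
```

So reading sensor 3 becomes: select channel 3, then read the VL53L1X as if it were the only one on the bus.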
Then as you can probably tell from that ribbon cable, we want to look at adding on a Pi camera as well (another part of our prize from last year). Although at the moment I have no idea how to code up the use of that, so hopefully Santa will bring me some tips.
Meanwhile, I’ve set the boys the task of coming up with a design for the body. I’m trying to keep them on the theme of space, but they seem more interested in baby sharks at the moment.
In the last post we noted that we were having controller issues. Most of our time at robot club was spent looking at the basic controller code, trying to understand what the issue might be. Deciding that we needed some debugging assistance, we put in a print of the information the robot was receiving from the controller to see if this would point us at anything.
In the way these things go, adding this single print statement to write out the joystick position every loop saw the robot suddenly start behaving perfectly. So we removed it again and immediately saw the erratic behaviour return.
So now we have a sleep(0.1) statement in there instead. We’re assuming that we were confusing things by polling the controller too often, or sending commands to the motor controller board too often. Still, now that we’re having a little sleep every so often, things are fine.
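The fixed loop now has roughly this shape. The helper names are placeholders for whatever reads our controller and drives the Picon Zero, and the interval is a parameter only so it can be tuned:

```python
import time

def control_loop(read_joystick, set_motors, iterations, interval=0.1):
    """Poll the controller and update the motors, sleeping briefly on
    each pass so we don't flood the controller or the motor board with
    requests -- the sleep is what cured our erratic driving."""
    for _ in range(iterations):
        x, y = read_joystick()
        set_motors(x, y)
        time.sleep(interval)
```

In the real code this runs until the stop button is pressed rather than for a fixed number of iterations.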
It just confirms what we’ve all been told: a little bit of sleep really does help you work better!