Onward

We’re now approaching the end of the quarter. Projects are in full swing and the personal calendar is filling with holiday activities. Certainly an exciting time in every academic year. This also marks the end of my time at OSU. Looking back, it has gone by so quickly, and from my perspective the amount of material I’ve gotten to cover in the last two years is staggering.

The OSU CS program has been highly effective in the goals that it sets out to achieve, and I value my opportunity to participate in it. This whole journey started with a friend off-handedly mentioning that he was in the program and suggesting that, if I was interested, I should check it out. It is crazy to think that that casual suggestion and a couple of Python tutorials sent me down the path of another degree and a drastic career change. This led to an internship with a large tech company and a subsequent offer to return full time in March next year.

I’m really looking forward to what is to come. With winter approaching, I’ve got travel plans for the holidays and a completely packed ski season in the US, Canada, France, and Austria, leading all the way into the beginning of my new career as a software engineer. All in all, things are well and I can’t wait to see what else the future will bring.

A Fall Seattle Day

It’s a rainy Thursday morning in Seattle. All last night there was a constant muffled rhythm of rain on the roof of the house. Perfect weather for a hot cup of coffee, turning the furnace up a couple of degrees, and working on some homework before heading in to work this afternoon.

A cheap economic barometer:
Peet’s coffee is usually my go-to for moderately priced, large batch, buy-it-anywhere coffee. The reliable workhorse. But as of this week, the price has doubled from where it was no more than two years ago. It used to be that you could buy a twelve ounce bag of Major Dickason’s for $5.99, occasionally $6.99. First the standard bag size was cut back to 10.5 ounces, and over the last year and a half or so, the price slowly rose and sale prices became less frequent on this particular brand. But this week, I was truly shocked to see the same bag of coffee selling for $11.99 at the grocery store down the street where I’ve done most of my shopping for years. I settled for Seattle’s Best Coffee, which, it is not.

The Coming Week:
This is going to be a busy week for the project. We are at the point of working through the image collection problem. My tasks involve working on the image stitching and layout creation methods. It’ll be a fun challenge working with some tools that I’ve never used before to assemble the photos that we’re taking. To do this I’ll be using OpenCV’s image stitching to merge overlapping photos. This is straightforward enough for our uses but does come with some constraints, such as being sensitive to order and orientation. However, it gets challenging when considering that there may be more than one distinct group of photos taken in a program run. We will need to be able to stitch together each group and then arrange the groups as they are truly laid out, with gaps filled with empty space in the final image. It looks like there are some tools that will help to create the image, but to derive the layout we’ll also need to use the recorded coordinates of the images to assign a reference point for each stitched image and specify the layout. These are just a generalization of some of the initial thoughts.
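As a first pass, here is a minimal sketch of the stitching step using OpenCV’s high-level Stitcher API in Python. The file names and the SCANS mode are placeholders for illustration, not the project’s final pipeline.

import cv2

# Load one group of overlapping photos (placeholder file names).
images = [cv2.imread(path) for path in ["shot_01.jpg", "shot_02.jpg", "shot_03.jpg"]]

# SCANS mode assumes roughly planar, translated images (a camera moving over a
# flat bed), which seems a better fit for a gantry-mounted camera than PANORAMA.
stitcher = cv2.Stitcher_create(cv2.Stitcher_SCANS)
status, stitched = stitcher.stitch(images)

if status == cv2.Stitcher_OK:  # 0 on success
    cv2.imwrite("stitched_group.png", stitched)
else:
    # Failure is often caused by too little overlap or out-of-order input.
    print(f"Stitching failed with status {status}")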

Motion and Iteration

It seems like every week’s tasks have (gratefully!) included learning and applying something new in the eGreenhouse project. In the last few days this has been setting up the movement commands for the CNC controller, which moves the camera and sensor module in the x and y directions along the tracks. The structure of this project and the hardware being used have been set up so that we can send movement commands in a pre-defined format. The CNC drivers have already been set up and subscribe to the GCodeFeed topic. The message format for this topic is a single string element. A controller node which accepts the strings and sends them through the specified serial port also exists. An ESP32 running Grbl is connected via USB to the port that we want to send instructions to and expects to receive newline-separated G-code strings.
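To make that flow concrete, here is a minimal sketch of publishing a G-code string to that topic, assuming ROS 2 and rclpy; the project’s actual ROS distribution, package layout, and node names may differ.

import rclpy
from rclpy.node import Node
from std_msgs.msg import String

class GCodePublisher(Node):
    def __init__(self):
        super().__init__("gcode_publisher")
        # GCodeFeed carries a single string element per message.
        self.pub = self.create_publisher(String, "GCodeFeed", 10)

    def send(self, gcode: str):
        msg = String()
        msg.data = gcode + "\n"  # Grbl expects newline-separated commands
        self.pub.publish(msg)

def main():
    rclpy.init()
    node = GCodePublisher()
    node.send("G00 X100 Y50")  # rapid move to an example position
    rclpy.shutdown()

if __name__ == "__main__":
    main()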

There are many G-codes available for various processes, covering movement direction, path geometry, tooling control, and the list goes on and on. For this project we are mainly interested in linear motion, to make the end effector with the camera and sensors travel along a list of destinations to collect data. The two G-codes for linear movement are G00 and G01. G00 is a rapid movement, which just tells the controller to move to the destination as fast as possible without performing any work. G01 tells the controller to move to the destination at a specified feed rate and allows for concurrent operations such as milling or extrusion in the case of a CNC mill or 3D printer. For now I’ve been primarily concerned with sending properly formatted codes to a mock node. Since I’m working with a mock, the code at this point is arbitrary and I’ve been using G00, but ultimately this code could change.
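For reference, the two commands look something like the following; the coordinates and feed rate are placeholder values.

# Illustrative G-code strings for the two linear-motion commands discussed above.
RAPID_MOVE = "G00 X120 Y80"      # rapid move: as fast as possible, no work performed
FEED_MOVE = "G01 X120 Y80 F500"  # controlled move at feed rate F, allows concurrent work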

The point of today’s post is to talk about the requirements that we have been given for sending instructions to the ESP32 and a couple of iterations I’ve gone through to get to where I am right now. Our goal is for the user to be able to select a series of locations in the interface. The interface should convert those into absolute x-y coordinates measured in millimeters and store them in the database. The ROS program can then read the set of locations (referred to as waypoints), tell the controller to move to each waypoint, and perform the associated action at each. Although we could send the waypoint coordinates to the controller with no intermediate locations and the end effector would (assuming no errors) arrive at the specified destinations, we have been tasked with splitting up the distance to provide more flexibility, such as inserting additional commands into the path or interrupting the movement operation.
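As a rough illustration of what the program reads back, a waypoint might be represented something like this; the field names are my own assumptions, not the actual database schema.

from dataclasses import dataclass

@dataclass
class Waypoint:
    x_mm: float           # absolute x coordinate in millimeters
    y_mm: float           # absolute y coordinate in millimeters
    action: str = "none"  # data-collection action to perform at this location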

The first step that I took was to make the end effector mock “move.” Although it is just a mock, it stores a location and has a means of moving about virtual space. First, to make sure that the mock could stand in for the ESP32, I made it subscribe to the same topic that the ESP32 will use, parse the G-code, and use its controller function to move to the specified location. The location is just a simple x-y coordinate, and the mock increments or decrements it until the current location matches the destination. While the mock is “moving” it also publishes its location to the topic that the actual ESP32 will report the real location to. Success! In iteration 1, the communication channels are functioning, the mock is working as needed, and the MotionNode is telling the ProgramNode when it reaches the destination.
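Stripped of the ROS wiring, the core of the mock’s “movement” looks roughly like the following; the helper names are mine and the parsing is simplified.

import re

def parse_gcode(line: str):
    """Extract the X and Y targets from a command like 'G00 X21 Y14'."""
    x = re.search(r"X(-?\d+\.?\d*)", line)
    y = re.search(r"Y(-?\d+\.?\d*)", line)
    return (float(x.group(1)) if x else None,
            float(y.group(1)) if y else None)

def step_toward(current: float, target: float) -> float:
    """Increment or decrement by one unit until the target is reached."""
    if current < target:
        return current + 1
    if current > target:
        return current - 1
    return current

# Example: "move" from (0, 0) to the parsed destination.
x, y = 0.0, 0.0
tx, ty = parse_gcode("G00 X3 Y2")
while (x, y) != (tx, ty):
    x, y = step_toward(x, tx), step_toward(y, ty)
    print(f"current position: ({x}, {y})")  # stands in for publishing the location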

Next was to create some intermediate G-codes. In this step I wanted to split the path to the destination into a series of steps, each 5 linear units in length. When sending codes to the actual ESP32, units will be in millimeters, but since this is just a mock, they are arbitrary. In this iteration the virtual end effector moves in the x direction first, then in the y direction. If we wanted to move from (0, 0) to (21, 14), the MotionNode would publish the destinations (5, 0), (10, 0), (15, 0), (20, 0), (21, 0), (21, 5), (21, 10), (21, 14) as formatted G-codes (G00 X<x> Y<y> Z<z>) to the GCodeFeed. Once again, success! The ESP Mock can move in virtual space from its location to the destination, passing through each location published to the GCodeFeed.
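Here is a sketch of that x-then-y segmentation, reproducing the (0, 0) to (21, 14) example with 5-unit steps; the output formatting is illustrative.

def axis_steps(start: float, end: float, step: float = 5.0):
    """Intermediate values along one axis, ending exactly at the target."""
    points = []
    pos = start
    while abs(end - pos) > step:
        pos += step if end > pos else -step
        points.append(pos)
    points.append(end)
    return points

def l_shaped_path(x0, y0, x1, y1, step=5.0):
    # Move along x first (y held), then along y (x held at the destination).
    return ([(x, y0) for x in axis_steps(x0, x1, step)] +
            [(x1, y) for y in axis_steps(y0, y1, step)])

for x, y in l_shaped_path(0, 0, 21, 14):
    print(f"G00 X{x:g} Y{y:g}")
# yields (5,0), (10,0), (15,0), (20,0), (21,0), (21,5), (21,10), (21,14)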

Although now we’re “moving” along a segmented path, this still seems somewhat inefficient. Why should we waste the time of traveling to the user’s selected destination in an L-shaped path? Next is to move 5 units at a time directly toward the destination. Or, in other words, time to throw in a little bit of geometry. In this iteration we are going to calculate each intermediate x-y coordinate as a 5 unit increment along the direct path from the current location to the destination. To do this, we’ll just use right triangle properties to calculate the angle between the x-axis and the destination as Θ = tan⁻¹(Δy/Δx) and the length of the direct path as √(Δx² + Δy²). The x component of each step will be the length of the segment (5 units) times cos(Θ). Similarly, the y component will be the length of the segment times sin(Θ). We’ll then take the floor of the length of the direct path divided by the segment length. Each intermediate coordinate will be the starting location plus i times the x and y components, for each integer i from 1 to that floor, followed by the destination itself. We’ll then publish each of these destinations as G-code formatted strings to the GCodeFeed to be read by the ESP Mock.
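Here is a sketch of that calculation, using atan2 so the angle works in every quadrant; the function name and output formatting are placeholders.

import math

def direct_path(x0, y0, x1, y1, step=5.0):
    dx, dy = x1 - x0, y1 - y0
    distance = math.hypot(dx, dy)         # length of the direct path, sqrt(dx^2 + dy^2)
    theta = math.atan2(dy, dx)            # angle from the x-axis to the destination
    step_x = step * math.cos(theta)       # x component of one 5-unit segment
    step_y = step * math.sin(theta)       # y component of one 5-unit segment
    n_full = math.floor(distance / step)  # whole segments that fit in the path

    points = [(x0 + i * step_x, y0 + i * step_y) for i in range(1, n_full + 1)]
    points.append((x1, y1))               # finish exactly at the destination
    return points

for x, y in direct_path(0, 0, 21, 14):
    print(f"G00 X{x:.1f} Y{y:.1f}")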

So, the question now is… does it work? All I can say right now is we’ll see. The concept should be sound, and as long as I haven’t mixed up any calculations when writing the code, it should work, or at least within the tolerances of the system. More to follow next week!