Categories
Uncategorized

Changing Settings And Frog-Fish

Good news for the project! We met with Professor Bailey, and we are confirmed to get a HoloLens 2. What this means, though, is that we have to meet as a group and make sure we have the right configuration for the HoloLens 2 and its MRTK. From what I am aware of, there shouldn't be too many noticeable differences other than some quality-of-life improvements, but we are doing it to be on the safe side, since some things from the older version have been deprecated.

In addition, what I have found while emulating the HoloLens is that my computer cries in agony in terms of memory usage. So before I can get a proper video of the test program I have been running, I will most likely have to get more RAM. Thankfully, I had been planning on getting some anyway, so I am ordering it now and hopefully it will arrive around the same time we do the trade-off with Professor Higgins to get our hands on the HoloLens 2 and begin testing its area-mapping function.

Other than this, I can't really provide a good video, and a screenshot feels incredibly artificial since I could just doctor it. So as I fail to meet expectations again (and rag on myself about it), I will have to ask for the ever-extended patience of those who read this.

Moving on from the project, my weekly DnD game has had an exciting turn of events. A player has essentially walked into the hands of an unknown entity because of how cute it was. It turns out that presenting something you would want as your pet sidekick as a threat is a dangerous thing to do.

From @Pomcentric on Twitter

After a merfolk sorceress played a flute to a god at an altar, the cave went pitch black, and all that could be seen was the light from a warlock's staff and this little guy. After ominous music began playing, the party began to feel scales and slime move across their bodies even though they saw nothing.

This creature stood still. The sorceress could not contain her excitement at the cuteness this frog-fish contained and got closer and closer. The group warned her, but as if she had placed herself under a spell, she continued to close the gap between them. Upon getting close enough to touch the creature, the sorceress felt multiple pulls of magic and what felt like invisible hands dragging her into the nearby water. Where the assaults on her mind had failed, the invisible force at hand succeeded in pulling her in.

That's where we left our session, as everyone was in disbelief at what had happened. I had a lot of fun GMing it, and my players squealed in excitement over what would happen next.

Anyways, I have ranted enough about DnD this week. Maybe I'll just keep doing DnD stories, since this isn't necessarily supposed to be CS content to begin with. But I feel like I owe it to myself to talk about updates in the project, as it keeps getting more and more interesting even though we are slightly behind due to requirements changes and now hardware changes.


A button?

The delays of technology and getting the thumbs-up are finally out of the way, and I have personally started on the project in terms of actual coding. Though, I am now realizing that a lot of the information I thought we had available in the form of Mixed Reality Toolkit (MRTK) tutorials is outdated. We are on MRTK v2, and all of the tutorials older than a year are for MRTK v1. So there is more documentation to pore over than I initially expected. The differences in syntax are small enough that I have to scour the references rather than follow some simple guides for implementing basic buttons.

I don't have much in terms of code to show, or any really outstanding picture like I always promise but never deliver on. BUT! I am getting more confident at setting up the scene in general, so hopefully I will have a crappy small clip of an object being generated when a button is clicked! Yes, yes, very exciting to hear news about a button. To be clear, the button would be the user pushing a button through the lens itself and not a keystroke, if that makes it more interesting.

I realized, though, that the part of the project assigned to me somewhat had to wait until we figured out how we will read and output the data from a CSV file to our system.
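As a rough sketch of the kind of CSV handling we will eventually need (the column names `gauge_id` and `strain` here are placeholders I made up, not our actual format), reading gauge values into a lookup table might look like:

```python
import csv
import io

# Hypothetical CSV layout: one row per strain gauge reading.
SAMPLE = """gauge_id,strain
G1,0.0012
G2,0.0034
G3,0.0008
"""

def load_readings(csv_file):
    """Map each gauge id to its most recent strain reading."""
    readings = {}
    for row in csv.DictReader(csv_file):
        readings[row["gauge_id"]] = float(row["strain"])
    return readings

readings = load_readings(io.StringIO(SAMPLE))
print(readings["G2"])  # -> 0.0034
```

In the real system the file (or stream) would come from the gauge hardware, and the values would feed the HoloLens visualization instead of a print.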

Beyond that, there is not much else to talk about in terms of the capstone project, as my other classes have been taking up most of my time.

That doesn't mean I stopped playing DnD, though. (That's a hint that you can take off now if you just needed to see that I am writing about something technology-related.)

I have started to become more confident at creating maps on the fly and with speed while keeping them looking good. I made this map in under an hour when my players wanted to keep going in the session.

It is one of the drawbacks of using a map-based system (rather than good ol' imagination). I have found, though, that the map creation process has gotten a lot easier now that I know some tricks for making textures not look like they are repeating up close. You can tell from afar, though, that the ground texture repeats.

When I began running DnD in a tool like MapTool, I was worried that my maps would not hold any sort of longevity. My players have proved me wrong, though, as they are deathly afraid that I will kill them at every single tile placed on the map.

Truth be told, I am terrified that I will accidentally kill them as well. As much as I try to trust the math given to me for creating encounters, I can't help but think that I will make an encounter that simply overpowers them in terms of action economy, given that, for the most part, each player usually gets one major action per turn besides using resources. So going forward I will try to create encounters where enemies spend resources to gain action economy rather than flat-out outnumbering the party from the beginning of an encounter.
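The action economy worry above boils down to simple counting. Here is a toy sanity check I could run on paper before a session (the 1.5 danger threshold is my own made-up heuristic, not an official rule):

```python
def actions_per_round(creatures):
    """Sum the major actions each side gets in one round.

    Each creature is (name, actions); most player characters get 1.
    """
    return sum(actions for _, actions in creatures)

def action_economy_ratio(enemies, party):
    """Rough ratio of enemy actions to party actions per round."""
    return actions_per_round(enemies) / actions_per_round(party)

party = [("fighter", 1), ("sorceress", 1), ("warlock", 1), ("cleric", 1)]
enemies = [("ogre", 1), ("goblin", 1), ("goblin", 1),
           ("goblin", 1), ("goblin", 1)]

ratio = action_economy_ratio(enemies, party)
print(f"enemy-to-party action ratio: {ratio:.2f}")  # 1.25
if ratio > 1.5:  # made-up danger threshold
    print("encounter may overwhelm the party on action economy alone")
```

It ignores damage and hit points entirely, but it catches the "five turns against four" problem at a glance.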

That’s it for this week. I will make a small video that shows an object being created by the push of a button through the emulation in Unity.


Roadblock and Fantasy

Now that we have gotten the okay from our project partner, we have started development. I have hit a couple of roadblocks personally, most likely due either to an unfinished setup of MRTK/Unity or to an issue with how my Visual Studio installations have been set up. Basically, I just decided to start clean, since I now have more experience with managing and organizing my workflow in general. So, once again, no real examples of personal code that is working.

Since there’s not much in an academic/project sense to go off of, I will talk about my recent hobby that I have been spending my free time on.

Dungeons and Dragons! As stereotypical as nerds can get. I have been running a DnD game for the last ~4 weeks on a tool called MapTool, an open-source virtual tabletop (VTT). What has been super fun about it, other than playing DnD, has been the versatility of its macros and scripting. Even though I haven't used Java in a long time, I still feel at home creating macros for players, which simplify a lot of the overhead involved in DnD, such as note taking and calculating modifiers. It feels simple enough that I have recommended that my friends who are interested in programming and currently in the campaign mess around with this tool to get a general idea of how variables and functions work.
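As an illustration of the kind of helper such a macro replaces (sketched in Python here rather than MapTool's own macro script), the classic example is the ability modifier calculation from 5e:

```python
def ability_modifier(score):
    """Standard DnD 5e ability modifier: floor((score - 10) / 2)."""
    return (score - 10) // 2  # floor division handles negatives correctly

# A quick table check against the familiar values.
assert ability_modifier(10) == 0
assert ability_modifier(15) == 2
assert ability_modifier(8) == -1
```

A one-line function like this is exactly the sort of thing a new programmer can write as a macro and immediately see pay off at the table.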

The campaign is fairly homebrew. I am using the homebrew of Arcadum, a popular game master (GM) who streams on Twitch.tv, as a base for mine in terms of mechanics.

So far my players have loved the addition of the Martial check, which gives players an in-game tool to help their character determine the martial prowess of an enemy or the tactics it could employ. It is hard for us as people not to know what a zombie or a vampire is, but our player characters are not in the know about how the fantasy genre works, since they are a part of it, or at least they shouldn't be. This creates interesting role-playing opportunities as well. Getting a score of 1 could lead a character to believe that a non-player character (NPC) is magnitudes stronger than it really is. A 20 (or higher) could let you determine all of its strengths and weaknesses at a glance.
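The tiered-information idea can be sketched as a small lookup. To be clear, the thresholds and tier text below are inventions for illustration, not the actual table from my notes:

```python
import random

# Illustrative information tiers for a Martial check; the thresholds
# and wording are made up for this sketch, not my real homebrew table.
TIERS = [
    (20, "full read: strengths, weaknesses, and likely tactics"),
    (15, "partial read: one weakness or notable trait"),
    (10, "general impression of relative strength"),
    (1,  "a misleading read: the enemy seems far stronger than it is"),
]

def martial_check(modifier, roll=None):
    """Roll d20 + modifier and return the information tier unlocked."""
    if roll is None:
        roll = random.randint(1, 20)
    total = roll + modifier
    for threshold, info in TIERS:
        if total >= threshold:
            return info
    return TIERS[-1][1]

print(martial_check(3, roll=18))  # total 21 -> full read
```

The nice property is that the GM only writes the tier table once per monster type, and the die roll does the pacing of the reveal.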

Another strength of the Martial check has been in creating encounters. At its core, the game can be fairly simplistic when it comes to creating encounters for new players and new GMs. The addition of a Martial check allows players to slowly unlock mechanics of a fight that would otherwise be impossible for a player to realistically know without having them spelled out. Traits could be unveiled in a fight that help the players determine what kind of enemy they are fighting. Are they virtuous and will not finish you off when they down you? Or will they take the very first opportunity to finish off a character should they see the chance? It helps players and GMs understand the stakes of the encounter, both in character and out of character.

I could go on about DnD, and I might next week too, since it's super easy to talk about. But I do not want to overload this post by writing every single thing I like about the mechanics I have decided to use. Next week I will have proof that the project exists! And hopefully I won't bore you with rants about DnD.


Starting Line

I hope everybody has had a good winter break. Winter term has begun, and with it the return of the capstone project. As a reminder, the team I am a part of is working on an Augmented Reality (AR) application that aims to show strain (now displacement) in real time, with the intended users being in the Civil Engineering department at Oregon State University (OSU).

Winter term means that we are to begin development in full. Though, our project is slightly behind, since we had a requirements change near the end of fall term. The changes were not drastic in terms of application scale or concept, but they changed what we are aiming to 'solve' with the application: we are now mainly looking for displacement of an object and showing the changes with a 3D object projected over the real object being experimented on. Similar in concept, but different in approach.

The beginning of the term has had the team getting acquainted with the tools we will be working with to create the application: the Microsoft HoloLens (still unknown if 1 or 2) and the toolkit (MRTK) it uses, bundled with Unity.

We have created a schedule in which our first sprint will be slightly shorter than anticipated, since we did not yet have the okay to start the project from our project sponsor, Mike Bailey (we do as of now, though). The current plan is to meet with our project sponsor on 1/19/2021 and begin the second sprint on the same day, since we now have the okay to begin implementing.

The schedule change is aimed at letting us meet with Professor Bailey at the beginning of each sprint and present our current progress, should he wish to see it and provide input. We thought it would be more beneficial to synchronize the meetings on the same day to keep feedback fresh and productive.

Otherwise, there is not much else to report on the project since we are just now getting into the midst of things. Hopefully I will have something interesting to bring up about the UI design then.


Not Project Related

This week has been uneventful in terms of my capstone project. I got some reading in, and I have a recording to watch from a free workshop I signed up for that covers setting up a HoloLens 2 environment, project settings, and model settings. I also have a workshop for Unity C#, which should be useful once we begin implementing.

So, I’ll talk about some of the neat tools and technologies I discovered through the discussions and learning tools of other classes.

First off, I am loving coverage.py for unit testing. It has been incredibly useful in measuring code coverage, especially branch and condition coverage. It can generate reports in multiple formats, such as HTML, that I can share with others (which sounds useful when pair programming).
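As a minimal sketch of what branch coverage catches, here is a function with three branches and a test for each; running tests like these under coverage.py's usual workflow (`coverage run -m pytest`, then `coverage html`) only reports full branch coverage when all three tests are present:

```python
def clamp(value, low, high):
    """Clamp value into [low, high]; three branches to cover."""
    if value < low:
        return low
    if value > high:
        return high
    return value

# Each test exercises a different branch, so branch coverage
# drops below 100% if any one of them is deleted.
def test_below():
    assert clamp(-5, 0, 10) == 0

def test_above():
    assert clamp(15, 0, 10) == 10

def test_inside():
    assert clamp(5, 0, 10) == 5
```

Line coverage alone would be satisfied with fewer tests; the branch report is what surfaces the untested path.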

I have always had a fascination with unit testing, as it feels like a foolproof way (as long as the tests are made correctly) to ensure that no major bugs make it through development. I'm sure other IDEs and languages have similar functionality, which makes me want to try more IDEs, like JetBrains's CLion, to see their out-of-box versatility and functionality.

Another tool that I found potentially useful (and I am sure there are more tools with the same function) is Visual Paradigm. You can generate UML diagrams from code you have written, or generate the respective code from a diagram.

I think this is super useful for checking a codebase to ensure that the way it is implemented is represented faithfully in UML form.

Likewise, if a UML diagram is created at the start of a project (I am assuming a larger-scale or more organized one), you can then use the diagram to generate all of the respective files in a given language. This can speed up development by cutting down on the manual creation of boilerplate code.
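As a toy example of what "diagram to boilerplate" means in practice, a class box with two attributes and one operation would typically round-trip to a skeleton like this (the names are invented for illustration):

```python
class StrainGauge:
    """Skeleton a UML class box might generate: attributes plus stubs."""

    def __init__(self, gauge_id: str, location: str):
        self.gauge_id = gauge_id
        self.location = location

    def read_value(self) -> float:
        """Operation stub left for the developer to implement."""
        raise NotImplementedError
```

The generator writes the structure; the bodies of the operations are still yours to fill in, which is exactly the tedious part it saves you from typing.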

I'll most likely try to use this sort of tool, whether it be Visual Paradigm or something similar, in some upcoming projects just to get familiar with it.

We get to meet with our mentor this week, so I might have more details on the AR project then.


This Sounds Easy…

As I look deeper into the documentation and development process of using the HoloLens 2, I find myself questioning whether what we are doing is easy or not. I keep seeing cool features of the MRTK and think to myself, "Doesn't this just simplify X part of the project?" Creating objects and the triggers they can have seems very simple with MRTK.

I hope I don't get hit with a steep learning curve that I missed when skimming the technology. From what I gather, the most difficult part of the project will be receiving the data from the strain gauges wirelessly. As of now, I have zero idea of how that process will occur, since it is not my requirement, but maybe some more research will help, since it seems useful to know.

Otherwise, the project feels like it’s going stunningly in terms of creating the requirements, team dynamic and communication, and the ideas I have for creating mockups for menus, objects, etc.

Learning more about how to actually develop in the environment gives me confidence, as we have handed out team roles to ensure we each finish the requirements in tandem. To ensure we each get experience with the multiple aspects of developing with a HoloLens, we divvied up the critical and some of the important requirements equally in terms of technology.

Our last meeting with our team sponsor gave us a more concrete idea of how the client's current testing environment works. It opens us up to more options for creating objects and how they need to be anchored. I'm beginning to think an open design that lets the user manipulate objects to their preference is the direction we should be heading.

There is also the option of using some sort of occlusion to allow the user to see directly inside the object and view each strain gauge in an almost science-fiction manner. Being able to see inside an object by rendering a facsimile object in front of it to trick the user's eyes is a neat idea I hope I can test a little while we are working on the project.

Next week I want to create a sample program and hopefully give a couple pictures on the blog. Until next time, stay safe.


Term in Full Force

As classes settle into their course content, introducing material and preparing students for remote teaching, I have been reminded of how incredibly rough the pacing of school can be. My Canvas calendar appears to grow with due dates as I preview what is to come. Nonetheless, the work must continue!

This week I wanted to talk about some of my preliminary findings on the HoloLens I discussed in last week's post. I haven't looked full force into the specific documentation, as we haven't confirmed whether we are going to be using a HoloLens 1 or 2 yet. I have, though, seen many of the capabilities of the HoloLens 2 and am getting even more hopeful that we can get our --> hands <-- on one!

There are many differences, such as being able to use both of your hands to perform gestures. There is eye tracking now, and many more gestures one can perform. I've also noticed multiple comment threads saying it is much more comfortable to wear, which sounds important for the client at the end of the day.

While I would love to just give a huge list of the differences, I should give some context for the ideas I have for our project should we get a HoloLens 2. Primarily, they have to do with the upgraded gestures. Being able to grab and manipulate the projections we make will be incredibly useful, as over time the user may find that they have a preference for where their projections are displayed. I always find myself trying to make things more comfortable whenever I get into a new environment; I'm still making adjustments to the IDEs I use and even to my own room layout. We all find a groove or specific layout we like over time, and giving the client the option to set up specific layouts, or even sizes, for their testing environment sounds super useful.

Ergonomics aside, the differences in CPU, RAM size (4 GB vs 2 GB), and improved WiFi capabilities will help ensure that real-time updates are possible, letting the user change tests or experiments at a moment's notice. Maybe they notice odd behavior that they want to inspect before the strain gets too heavy on the material and breaks it. Ensuring there is no lag from the connection and no slowdown from the software itself seems paramount to what the user wants to achieve with this device.

I am not sure how difficult it will be on the processor and the WiFi (consistency, at least, in terms of WiFi) to do what we want, as we haven't been able to talk to the user directly to see what the testing conditions are. For all we know, there could be an incredibly large number of strain gauges in use, which would change how we approach modeling the data in general.

Otherwise, I am feeling quite content this week in terms of what I have accomplished. I learned about some of the differences between developing in Unity and Unreal Engine, and while Unreal Engine comes off as more difficult (the general consensus), I find it very appealing in terms of capability and technology.

Next week we will have our first follow-up meeting, and as much as I know we are waterfalling this project, I hope we get to do some hands-on programming and testing soon. So I will most likely cover more of my individual research into one of the two engines, or ramble about class assignments and secretly reference my upcoming D&D campaign.


The Start of the AR Capstone Project and Teambuilding

This week has been very tense, motivating, and captivating, as we finally have our capstone teams assembled. I am on the Augmented Reality for Realtime Strain Visualization (AR-RSV) team, mentored by and partnered with Professor Mike Bailey at OSU. The team has had a chance to meet together as a group and with Professor Bailey. He has been incredibly helpful so far and has basically left himself open for us to ask questions. I will go through what went well this week and what I understand about my project so far.

First I want to share how well the initial team contact has gone, and how ideal the week has been in terms of communication between my peers.

We all immediately settled on an instant-messenger-type service that we prefer in order to have instant feedback from each other. We chose Discord, knowing it might not have some of the versatility or features that other services like Slack offer, but it is what we are comfortable with.

We are testing Asana as a Kanban board and general task-management system for our assignments, as I have had experience using it before. I suggested it to the team, and we have all agreed to use it for the time being to see how the workflow goes. I am confident that we have gotten off to a great start, as we have already helped hold each other accountable by requiring peer review as early as our first papers. Additionally, we are sending our papers to Professor Bailey so he can see if we are missing anything and provide feedback on our assignments. He wants to be included so he can aid us through the capstone project and ensure that, as long as we do the work, we will succeed.

The environment that has been created so far, even if it may be a honeymoon phase, has been great. A solid organizational foundation feels great compared to other group projects I have had at OSU. Previous group projects at OSU have felt very unorganized and almost confusing, where messaging is done through direct messages (DMs) and communication can happen days apart with no real update on progress. So I am somewhat more at ease going into this term than I initially was, due to the clear organization of our team.

The general idea of the project is as follows: we will be programming a Microsoft HoloLens 1 or 2, a mixed reality device, to display contour lines or heat maps in real time as a concrete test is occurring. This will allow our client to change tests as they see fit based on the values they see. The initial project meeting went well, as Professor Bailey gave us a clear vision of how the product is to work and how we will be able to achieve it. We are still waiting to see if we can get a Microsoft HoloLens 2, though. Professor Bailey stated that the differences between the first model and the second greatly reduce the burden on the programmer in terms of managing software.

I am still in my preliminary stages of researching the technology, though I have learned enough to know that we can embed markers in images to use as anchors for creating visualizations.

Based on the information I have synthesized from the meeting, my initial idea is that as long as we add some sort of marker to each strain gauge, or to an area suitable to represent its location, we can display that gauge's current strain value whenever the user looks at it. There is also the option of using a location-based device to map the strain gauges, but it seems much cheaper and more feasible to simply have a printed image at a location.
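The marker idea reduces to a lookup. A bare-bones sketch (every name here is hypothetical; in the real system the marker IDs would come from the headset's image tracking, not a hard-coded dictionary):

```python
# Hypothetical marker-to-gauge registry; the real IDs would come from
# the HoloLens's image tracking rather than being hard-coded.
MARKER_TO_GAUGE = {
    "marker_01": "gauge_north_beam",
    "marker_02": "gauge_south_beam",
}

# Latest values, which would be streamed in from the gauges.
latest_readings = {
    "gauge_north_beam": 0.0021,
    "gauge_south_beam": 0.0017,
}

def label_for_marker(marker_id):
    """Return the text to overlay when the user looks at a marker."""
    gauge = MARKER_TO_GAUGE.get(marker_id)
    if gauge is None:
        return "unknown marker"
    return f"{gauge}: {latest_readings[gauge]:.4f} strain"

print(label_for_marker("marker_01"))
```

The interesting engineering is everything around this lookup: detecting the marker, anchoring the overlay, and keeping `latest_readings` fresh over the wireless link.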

Although it is only the first true week of our project, I feel that our team is poised for success with the amount of work we have already put into the organization and structure of our team.

Next week I expect to have learned more about the HoloLens through reading its SDK, so I can describe some features and possibilities.


Breaking Rules and Looking Forward

When I transferred to OSU, I originally had the intention of staying away from computer graphics, or much of anything that could be perceived as focused on the gaming industry. It was a rule I held myself to when I decided to go to a university for computer science (CS): I wanted to make sure that I was interested in CS itself and not just the concept of working on a game. I felt that I had spent so much of my time in high school gaming and neglecting my studies in general, so I needed to make sure I wasn't lying to myself about being interested in CS.

I am going to break this rule.

Not because of how much I have learned about the field, or how much I enjoy the process of programming and problem solving. I am breaking it because of my interest in the technology behind my chosen senior capstone project. I feel that my curiosity as an engineer has taken root, and my old concern about my studies no longer exists.

The capstone project I am on is Augmented Reality (AR) for Realtime Strain Visualization, mentored by Professor Mike Bailey at OSU. The moment I read it in the list of projects, I had an immediate interest due to the cross-disciplinary nature of the project and the use of AR to solve a problem. I also had a very good experience listening in on some of Professor Bailey's lectures the previous year while waiting between classes.

So I am looking forward to this year. The expansion of my knowledge in CS beyond just AI and algorithms leaves me excited to see what else I become interested in.


Hello world!

Welcome to blogs.oregonstate.edu. This is your first post. Edit or delete it, then start blogging!