Oculus Quest: The VR headset I’ve been waiting for!!

 

I have been a 3D interface geek for a long time, with a passion for 2.5D and 3D interface design and programming… I got one of the first Kinects (for PC) and the Kinect 2, and also played with the Oculus SDK 1 and 2…

For me, nausea was a solid wall whenever I entered VR… so I tried to focus on AR with HoloLens instead. I got tickets and a flight for the Build conference when a possible showcase of HoloLens at Build 2015 was announced, which finally happened, and I even got invited to a private preview and some hands-on programming… but it was also lacking some key factors for me.

Ever since the Oculus crowdfunding campaign I have been watching and trying all the different headsets, only to find issues (nausea, lack of responsiveness, not enough resolution, etc.) with the HTC Vive and later versions of the Rift… until now.

Meet the Oculus Quest

OQ

In short, the Quest is a standalone VR headset with OLED displays at 1440 x 1600 px per eye, a 72 Hz refresh rate, a Qualcomm Snapdragon 835 processor with 4 GB of RAM, 6 DOF tracking and a total weight of 571 g.

I will keep it short; these are its main features:

  • Simple and easy to put on.
  • 6DOF – this is important; see this animation:
    • giphy.6DOFgif.gif
  • Nice resolution, good framerate and IPD adjustment (very nice to have and essential IMHO).
  • Inside-out tracking that works flawlessly (even when moving the controllers behind your back – wow!)
  • Accurate and fast tracking, without any perceptible lag or delay – check how it handles “Beat Saber” at the highest difficulty, or see it in a video.
  • Safety: you can define a “play/interaction” area around you, and when you approach its boundary you get haptic and visual feedback. It works really well.
  • Wireless. No cables. No obstacles. Nothing but freedom.
  • PC Free, it is a standalone device.
  • Being standalone comes at a performance price, but it is barely noticeable.
  • Surprisingly good audio.
  • Some “Mixed Reality” support.

Not a feature as such: I suffer from severe motion sickness, yet I can stand it for half an hour to one hour. That is thanks to the combination of all the factors above (the features and how some of the games I’ve played have been programmed). And after an hour, I am not dizzy and have no nausea.

Verdict:

Simply put: a VR revolution, a sum of good ideas brought together and implemented to work with each other in a brilliant way. Simply wow! …and I mean a bold WOW!!

If you are interested in VR for gaming or for developing, this is it. It is easy to set up, there are no wires, no expensive PC is needed, it is fast and accurate, the visuals are really good, and it is also fairly easy to start programming for it.

Also, the inside-out tracking makes it not only easier to set up but also far cheaper than earlier VR systems, where you needed a couple of “lighthouse” sensors.

In my opinion, cost-efficiency wise, the Oculus Quest is the best VR headset on the market. It does the job, with good resolution and refresh rate, and without some of the major showstoppers of the recent past (needing a PC and a lengthy setup).

To my mind, it makes VR easy and mainstream. I guess that’s the reason why it has run out of stock at many suppliers and has sold $5 million in content in the first two weeks… and Superhot sales are 300% higher on the Quest than they were at the original 2016 Rift launch, which gives an idea of how the Quest is performing in relation to the original Rift.

Note that if you are interested, you’d better act quickly: it is out of stock in many places (in the US within as short a period as a week). I got mine quickly only because I went for the 128 GB version… and it is estimated that 2019 will see around 1M units sold.

 

Development:

Development for the OQ is great, easy to set up and dive into. It simply works.

Over the past two weeks I have de-rusted my Unity skills, and I am now learning how to use the “Oculus Integration” tools and connecting the VR side with a service I am setting up in the cloud with Azure, so I can interact with it in brilliant 3D.

And so far, the experience is brilliant and I am having a lot of fun 🙂

The only hint I can give is to invest in learning how to apply object pooling, as resources are limited on a standalone device, but that is common sense anyway.
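For illustration, here is a minimal object-pooling sketch for Unity in C#; the class name and the `projectilePrefab` field are made up for the example, and the point is simply to pre-instantiate objects and reuse them instead of calling Instantiate/Destroy in the hot path:

```csharp
// A minimal object pool sketch for Unity (C#). "projectilePrefab" is a hypothetical
// prefab you want to reuse instead of instantiating/destroying it on every shot.
using System.Collections.Generic;
using UnityEngine;

public class SimpleObjectPool : MonoBehaviour
{
    [SerializeField] private GameObject projectilePrefab; // hypothetical prefab
    [SerializeField] private int initialSize = 20;

    private readonly Queue<GameObject> pool = new Queue<GameObject>();

    private void Awake()
    {
        // Pre-instantiate the objects once, deactivated, to avoid runtime allocations.
        for (int i = 0; i < initialSize; i++)
        {
            var go = Instantiate(projectilePrefab);
            go.SetActive(false);
            pool.Enqueue(go);
        }
    }

    public GameObject Get(Vector3 position, Quaternion rotation)
    {
        // Reuse a pooled instance if available, otherwise grow the pool.
        var go = pool.Count > 0 ? pool.Dequeue() : Instantiate(projectilePrefab);
        go.transform.SetPositionAndRotation(position, rotation);
        go.SetActive(true);
        return go;
    }

    public void Release(GameObject go)
    {
        // Return the instance to the pool instead of destroying it.
        go.SetActive(false);
        pool.Enqueue(go);
    }
}
```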

For more information, go to developer.oculus.com 😉

 

Future:

The Oculus Quest supports some mixed reality: it has a mode in which you can locate the controllers within a generated view of your surroundings, plus some other minor applications.

BUT, and it is a big BUT, Oculus is actively working on mixed reality scenarios as well as collaborative interaction, as shown in the following video from the recent “Oculus Connect”:

It is also interesting to see how the Oculus Insight technology works:

Note: Insight was earlier referred to as “Project Santa Cruz”, which Oculus has been working on since 2016…

So, when is “Oasis” coming? 😉

And… I have a feeling that the upcoming “Oculus Connect 6” on September 25-26 will be worth watching…


Implementing a Strategy to “rule them all…”

The Strategy pattern

The Strategy pattern is one of the OOP design patterns that I like the most.

According to Wikipedia, “the strategy pattern (also known as the policy pattern) is a behavioral software design pattern that enables selecting an algorithm at runtime.” – source

This UML diagram showcases it pretty well:

Why do I like it?

I believe there are several reasons that make this design pattern one of the most useful ones around:

  • It reinforces the KISS (Keep It Simple and Standard) principle in the code.
  • LSP – The “strategies” are interchangeable and can be substituted for each other. This is a clear application of the L in SOLID, the Liskov Substitution Principle (LSP): “Objects in a program should be replaceable with instances of their subtypes without altering the correctness of that program.” – Source
  • Open-Closed – Implementing the Strategy through an interface is a clear application of the Open-Closed Principle: “software entities (classes, modules, functions, etc.) should be open for extension, but closed for modification”. In this case we can extend the behavior by writing a new class, but we cannot modify the interface. Even more, if we implement the Strategy using a plugin mechanism, we do not even need to modify the source code of the main application. It is a very clean implementation that also helps to decouple the code and responsibilities. – Source
  • SRP – We can also say that the Strategy pattern promotes the Single Responsibility Principle, as each strategy should be implemented in a single class.
  • DIP – And also the Dependency Inversion Principle: “One should depend upon abstractions, [not] concretions.” – Source. This holds because the strategies depend on an abstraction: the interface that defines the strategy.

 

We can easily implement the Strategy pattern with Dependency Injection, but then the code of the strategies lives in the same assembly or executable and is therefore coupled to it. Because of this, I consider it a sub-optimal implementation that does not fulfill the Open-Closed Principle 100% if we consider the main executable a “software entity”.
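For contrast, a minimal sketch of that DI-based variant using Microsoft.Extensions.DependencyInjection could look like the following (the IExportStrategy names are invented for the example); note how the composition root has to reference the concrete strategy at compile time:

```csharp
// A minimal sketch of the DI-based variant of the Strategy pattern.
// IExportStrategy and its implementation are invented names for the example.
using System;
using Microsoft.Extensions.DependencyInjection;

public interface IExportStrategy { string Export(string data); }

public class JsonExportStrategy : IExportStrategy
{
    public string Export(string data) => $"{{ \"value\": \"{data}\" }}";
}

public class DiStrategyDemo
{
    public static void Run()
    {
        // The composition root references the concrete strategy at compile time,
        // so the strategy lives in (or is referenced by) the same executable.
        // That is exactly the coupling the plugin-based approach removes.
        var provider = new ServiceCollection()
            .AddTransient<IExportStrategy, JsonExportStrategy>()
            .BuildServiceProvider();

        var strategy = provider.GetRequiredService<IExportStrategy>();
        Console.WriteLine(strategy.Export("42"));
    }
}
```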

Even more, if we are in a highly regulated environment, the plugin approach means we can add functionality without altering “the main software”, which might be subject to a regulated process like FDA approval in the case of a medical system… and that means several months of documentation, testing and waiting for the FDA to sign everything off.

Do you like it already? Wait – there are more benefits!

In my previous job, at RUF Telematik, I proposed applying this pattern with a plugin system as part of the technical product roadmap, basically to decouple the code that interfaces with concrete hardware (type of hardware, manufacturer, version…). The main software would not need to know how to talk to a camera, monitor or communication device in the train system. That responsibility is delegated to a .dll plugin that knows how to do the work, and we can dynamically add these features without altering the main software.

In addition to the software architecture and code quality benefits, we gain some more:

  • We can parallelize the development of the hardware manager DLLs across different developers, who can test them separately.
  • We can separate the release and test workflows and shorten development time.
  • We do not need to retest the main software every time we add support for a new device or a new version of a device’s firmware.
  • We do not need to revalidate the full software against industry standards over and over again (usually at a substantial cost in time and money).

In a train we could categorize the hardware into the following four categories:

  • TFT screens
  • LCD Screens
  • RCOM Communication devices
  • Camera devices

Each one has different vendors, models and version numbers, so a somewhat more complex implementation would be needed, but this is an example, so we do not need to build “the real thing”.

So we could implement an interface like ITrainInstrumentManager that supports methods like:

  • Connect
  • Getversion
  • Update
  • Check
  • ExecuteTests
  • UpdateCredentials
  • and so on…

 

And then implement a strategy that fulfills this interface for every type of equipment, for every brand/vendor, and for every model and version…
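A possible shape for that interface and one such strategy is sketched below; the signatures and the vendor class are assumptions of mine based on the method list above, not the real product code:

```csharp
// A possible shape for the interface, based on the method list above.
// Signatures and types are illustrative assumptions, not the real product code.
public interface ITrainInstrumentManager
{
    bool Connect(string address);
    string GetVersion();
    bool Update(string firmwarePackagePath);
    bool Check();
    bool ExecuteTests();
    bool UpdateCredentials(string user, string password);
}

// One strategy per equipment type / vendor / model, e.g. a hypothetical camera:
public class AcmeCameraManagerV2 : ITrainInstrumentManager
{
    public bool Connect(string address)
    {
        // Vendor-specific connection protocol would live here.
        return true;
    }

    public string GetVersion() { return "2.1.0"; }
    public bool Update(string firmwarePackagePath) { return true; }
    public bool Check() { return true; }
    public bool ExecuteTests() { return true; }
    public bool UpdateCredentials(string user, string password) { return true; }
}
```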

This has the added benefit that I can parallelize the work and get several people working on different strategies, one for each new device. This would enable adding support for new hardware devices in no time at all.

And they could be tested separately, with the guarantee that if the tests pass, the main tool will work as well.

All without altering or re-releasing the tool: we just add the plugins to the corresponding folder or load them dynamically from an online service or location (if we implement the Strategy using the plugin technique, of course).

This presentation showcases some of the points mentioned above, if you are still curious.

 

Implementation of the Strategy Pattern

One of the best implementations I have ever been part of dates from when I worked during 2011 and 2012 at Parlam Software, where a plug-in architecture was designed and implemented by my now friend Xavier Casals. Back then he was my customer and the CTO of Parlam (and still is).

<Commercial mode>

If you are in need of translations, do check out their solution. Basically, it is a full-fledged TMS (Translation Management System) that automates your language translation process. More on this here and here.

</Commercial mode>

This plugin system made it possible to dynamically add data converters for third-party systems, such as CMS systems like “SDL Tridion”, which their service connects to and works with. Basically, they can deliver an interface to anybody who wants to integrate with their system, which enables an easy implementation as well as easy testing and deployment. Once a DLL is tested and verified, it can be signed for security reasons and dropped into a folder where it is magically loaded, and we get a perfect implementation of the Open-Closed Principle…

“software entities… should be open for extension, but closed for modification”

I know that is a bold claim, but let’s get it done and you can tell me afterwards 😉

 

Structure

We will create a .NET Standard solution which will implement three projects:

  • StrategyInterface –> A .NET Core class library that holds the Strategy interface, two custom attributes and a custom exception for managing the plugins. This is the basic contract shared between the main application that uses the plugin(s) and the plugins themselves.
  • Plugins –> A project with a simple class that implements the interface from the StrategyInterface project/assembly. I use the two custom attributes to add a name and a description, so I can go through the types programmatically with reflection before creating an instance, which is convenient if I want to avoid creating unnecessary objects. Note that there are several implementations of this project; in our case I created four: CAM, LED, RCOM and TFT. Each one builds a DLL into a concrete directory, “D:\Jola\Plugins”.
  • StrategyPatternDoneRight –> feel free to discuss the name with me, comments are open to all ;). This is the main consumer of the plugins that implement the Strategy; it loads the plugins that match the interface from a concrete location in the filesystem. For the moment I did not add much logic, just enough to load all the matching assemblies and execute a simple method that all the plugins provide.

The solution looks like:

Strategy01 structure

StrategyInterface project

The most important part here is the interface that defines the Strategy:

Strategy02 interface
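The screenshot aside, the contract boils down to something along these lines; the exact member names in the repository may differ, so treat this as a sketch:

```csharp
namespace StrategyInterface
{
    // The contract shared between the main application and the plugins.
    // Member names are assumptions; the real project may use different ones.
    public interface IStrategy
    {
        // The single command that every plugin must provide.
        string Execute();
    }
}
```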

Here we also create the custom attributes, one for the name and another for the description:

Strategy03 cust attrib name
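Again as a sketch, the two custom attributes could look roughly like this (the attribute class names are my assumption):

```csharp
using System;

namespace StrategyInterface
{
    // Custom attribute exposing a human-readable plugin name,
    // readable via reflection without instantiating the plugin class.
    [AttributeUsage(AttributeTargets.Class)]
    public class StrategyNameAttribute : Attribute
    {
        public string Name { get; }

        public StrategyNameAttribute(string name)
        {
            Name = name;
        }
    }

    // Same idea for a short description of what the plugin does.
    [AttributeUsage(AttributeTargets.Class)]
    public class StrategyDescriptionAttribute : Attribute
    {
        public string Description { get; }

        public StrategyDescriptionAttribute(string description)
        {
            Description = description;
        }
    }
}
```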

Plugin project(s)

I created a Plugins folder to contain them all, then created the .NET Standard assemblies and added a reference to the StrategyInterface project to each of them.

Let’s say we create the CAM.Generic project to implement support for the train network cameras… there we add a class that implements the Strategy interface, and we decorate it with the two custom attributes:

Strategy04 Plugin Strategy Implementation
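In code, such a plugin class would look roughly like the following; the class name and attribute values are illustrative:

```csharp
using StrategyInterface;

namespace CAM.Generic
{
    // A plugin strategy for the train network cameras.
    // Class name and attribute values are illustrative.
    [StrategyName("CAM")]
    [StrategyDescription("Handles the train network cameras")]
    public class CameraStrategy : IStrategy
    {
        public string Execute()
        {
            // In a real plugin, the hardware-dependent camera code would live here.
            return "Executing camera network operations...";
        }
    }
}
```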

Obviously this is a simplification, but here we would put all the hardware-dependent code for handling complex network operations with the camera…

All the plugin projects are configured with the same build output path, to avoid manual work:

Strategy05 Plugin Build properties

Just be aware that the output path you use must exist and must be the same for all plugins.

Main Application

So, all we have left is to implement the mechanism that picks up the assemblies from a concrete filesystem path and loads them dynamically into the current process. We will do this using reflection.

I am creating a wrapper class for exposing the strategies implemented by our plugin assemblies.

This class is named StrategyPluginContainer and exposes the two custom attributes and an instance of the plugin (really, an instance of the class that implements the Strategy interface).

The two key reflection techniques used here are:

  1. Activator.CreateInstance(Type) – creates an instance of the specified type using its default constructor. Note that this is not strictly reflection; it comes directly from the System namespace.
  2. Type.GetCustomAttributes(attributeType, inherit) – obtains the values of a custom attribute applied to a type.

Note: the green underlining comes from style suggestions of my VS installation, which I do not agree with when I am aiming for clarity. Expression-bodied properties or using ?? are fine and save space, but if somebody is not used to that syntax, readability and understandability suffer.

Strategy06 Plugin Wrapper
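A sketch of such a wrapper, following the description above (member names are assumptions on my side):

```csharp
using System;
using StrategyInterface;

// Wrapper exposing the plugin instance plus its Name/Description attributes.
// A sketch of the class described above; member names are assumptions.
public class StrategyPluginContainer
{
    public StrategyPluginContainer(Type pluginType)
    {
        PluginType = pluginType;

        // Read the custom attributes from the type without needing an instance.
        var nameAttrs = pluginType.GetCustomAttributes(typeof(StrategyNameAttribute), false);
        var descAttrs = pluginType.GetCustomAttributes(typeof(StrategyDescriptionAttribute), false);

        Name = nameAttrs.Length > 0
            ? ((StrategyNameAttribute)nameAttrs[0]).Name
            : pluginType.Name;
        Description = descAttrs.Length > 0
            ? ((StrategyDescriptionAttribute)descAttrs[0]).Description
            : string.Empty;

        // Create the actual strategy instance using the default constructor.
        Instance = (IStrategy)Activator.CreateInstance(pluginType);
    }

    public Type PluginType { get; }
    public string Name { get; }
    public string Description { get; }
    public IStrategy Instance { get; }
}
```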

Now we can implement the StrategyPluginLoader. This class’s responsibility is to keep a list of the plugins that implement the Strategy, and it does so by loading them from the filesystem (it could just as well get them from a web service or by other means).

Basically it holds a list of the StrategyPluginContainer we just created, exposed through a property.

It populates the list by getting all the DLLs from a specific folder on disk and loading them with reflection’s Assembly.LoadFrom(filename).

Then we get the types contained in each assembly and iterate through them, matching them against the Strategy interface. I also check that the two custom attributes are present and, if everything matches, I create a StrategyPluginContainer instance for that concrete type.

As a final check, I verify whether the plugin is already in the plugin list to avoid duplicates, and if it already exists I update it properly.

Strategy07 Plugin loader
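Again as a sketch of what such a loader could look like under the assumptions above (the folder path comes from the project description; member names are illustrative):

```csharp
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Reflection;
using StrategyInterface;

// Loads plugin assemblies from a folder and keeps the matching strategies.
// A sketch of the loader described above, not the exact code in the repository.
public class StrategyPluginLoader
{
    private const string PluginFolder = @"D:\Jola\Plugins";

    public List<StrategyPluginContainer> Plugins { get; } = new List<StrategyPluginContainer>();

    public void LoadPlugins()
    {
        foreach (var file in Directory.GetFiles(PluginFolder, "*.dll"))
        {
            // Load the assembly dynamically into the current process.
            var assembly = Assembly.LoadFrom(file);

            // Find the classes implementing the Strategy interface
            // that are decorated with both custom attributes.
            foreach (var type in assembly.GetTypes().Where(t =>
                         t.IsClass && !t.IsAbstract
                         && typeof(IStrategy).IsAssignableFrom(t)
                         && t.GetCustomAttributes(typeof(StrategyNameAttribute), false).Any()
                         && t.GetCustomAttributes(typeof(StrategyDescriptionAttribute), false).Any()))
            {
                var container = new StrategyPluginContainer(type);

                // Do not add the same plugin twice; replace an existing entry instead.
                var existingIndex = Plugins.FindIndex(p => p.Name == container.Name);
                if (existingIndex >= 0)
                    Plugins[existingIndex] = container;
                else
                    Plugins.Add(container);
            }
        }
    }
}
```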

Last but not least, I use all of this from a simple console application: I create the StrategyPluginLoader, execute the command to load all the plugins, and iterate through them invoking the only command in the interface, which is implemented in separate, decoupled assemblies and loaded dynamically at runtime, without any knowledge of or coupling to them in the main application.

Strategy08 bringing it together
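Put together, the console host boils down to something like this (matching the sketches above rather than the exact code in the repository):

```csharp
using System;

// The console host: load everything from the plugin folder and run each strategy.
public static class Program
{
    public static void Main()
    {
        var loader = new StrategyPluginLoader();
        loader.LoadPlugins();

        foreach (var plugin in loader.Plugins)
        {
            // The Name/Description come from the custom attributes,
            // Execute() from the dynamically loaded strategy instance.
            Console.WriteLine($"{plugin.Name} - {plugin.Description}");
            Console.WriteLine(plugin.Instance.Execute());
        }
    }
}
```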

The full code can be found on GitHub here.

 

Happy coding!

 

 

Roadmap towards Microsoft Azure…

Some time ago, about a month and a half, I decided to focus on Microsoft Azure technology and acquire expertise in it…

This is, roughly, what I have decided to do and how I am doing it.

I should say that I do not like taking chances and I usually over-prepare… which is convenient given how TRICKY some of these exams are (at least to me…).

This is the current Exam & Certification roadmap:

Azure_Certifications_04_2019

I disagree a bit with the architification path shown in green in the picture towards getting the “Azure Solutions Architect” certification. Even if you should be able to “paint boxes and connect them”, to me a software architect is somebody who also knows very well what is inside those boxes and how they work.

So for me, the roadmap towards the Azure Solutions Architect certification has the AZ-203 before the AZ-300.

So, in short, my initial roadmap is:

  1. Get AZ-900 (Update: got it!).
  2. Get AZ-203.
  3. Get AZ-300.
  4. Get AZ-301.

I’d like to have solid foundations, so I focus on a good understanding of the basics; to me, AZ-900 is a must-have. There are simply too many “things” (services, types of services, concepts…) lying around… so having a clear grounding is a must.

For the AZ-900 I have done:

(Now I am waiting to find some time, hopefully this week, to prepare for and take the exam, which you can book online through here.)

Update: the exam is done and passed; I will shortly post some of my comments and thoughts on it.

For the AZ-203 I am halfway through my preparation and have done / plan to do the following:

  • A very nice course from Scott Duffy at Udemy here: https://www.udemy.com/70532-azure/ (done, as well as some of the recommended HOLs)
  • The Pluralsight Paths
    • “Microsoft Azure for Developers”, 34 h (in progress).
    • Worth highlighting: their paths include a “Role IQ”, an in-portal exam system that helps you measure your level and see where to focus. This is what I got when I started, just after Scott Duffy’s training and some hands-on work: Azure dev IQ Pluralsight
    • “Developing Solutions for Microsoft Azure (AZ-203)”. And yes, this totals 59 hours, but it will probably be well worth watching. (Not started yet.)
  • The official HOL (Hands On Lab) for AZ-203 from Microsoft itself! (recommended by Scott Duffy)
  • Support from some of the Microsoft Learn resources, but if you filter by “Azure developer” there are way too many… I found that this link helped me greatly to focus; there you can see the following recommended learning roadmap: AZ-203 roadmap
  • Basically, all the following learning paths in Microsoft Learn. But I plan to use them only as support where I feel I am not confident in a topic.
  • I am setting up some projects of my own to put things together so I can glue them in a way that makes sense, but this has some work implications and thus I cannot share it in full detail. One of them is implementing a full REST API with Azure Functions and exposing it through Azure API Management, to finally consume it from an Azure App (a web app). I still have not decided whether the data will be stored in Cosmos DB or an Azure SQL database… but for sure it will have AD authentication.
  • And, of course, some exam practice to get a hands-on feeling and to get to know some of the tricks and traps you might face 😉
  • If you have any tips or recommendations, just shoot in the comments or contact me directly; it would be greatly appreciated. I know that some people just take Scott’s course, do some exam practice and pass, but I want more hands-on experience before moving forward.

 

For the “Azure Solutions Architect” certification, I would like to have some real experience and practice, but for now I plan to do:

 

And that’s it! Any comment or tip would be very welcome 🙂