
RoomAlive



Guest MAJ.Kaossilator=US=
Posted

 

The project used to be called IllumiRoom... looks like they didn't scrap it (I kinda thought they had). One comment on the video is pretty decent:

wow, a lot of haters. this isn't for consumer market guys! this is a developmental step forward to the tony stark augment reality work space.

 

I think it's a good point. Whether this specific tech goes anywhere is beside the point. It's actively exploring a more immersive, augmented work/gaming environment, which is pretty exciting. I don't think this will turn into a new console, or that any games will use this actual device. But it's progress in a good direction.

 

This is one more step to bring us from this:

http://i.imgur.com/GOEwc6v.png?1

 

 

Through this (not RoomAlive):

https://www.youtube.com/watch?v=B9ioVceVlvI

http://i.imgur.com/duXAYMM.jpg?1

 

 

To this:

http://i.imgur.com/OWKcgBi.jpg?1

 

 

Thoughts?

Guest StubbornOx1
Posted

Sir, ever since seeing the movie Gamer... this has always been a dream.

 

I think it is truly fascinating how immensely we have advanced since the earliest days of computing technology.

 

More specifically, the miniaturization of it all: from the earliest computers, which filled entire buildings yet could only do simple calculations, to the full-blown computers we carry around as phones.

 

It is truly awesome. (Using the word 'awesome' in its literal sense.)

 

I really can't control my excitement around these things. It's what sparked my interest in computer engineering.

 

Just my thoughts on the matter! Thanks for the post Sir!

 

If you could continue to post anything you find similar to this it would be greatly appreciated, and I am always willing to give feedback.

Guest RhinoTech
Posted

I don't feel big set-ups are the way forward for augmented-reality programs, sir. I imagine the system would consist of a main computer hub plus external sensors/devices for input and output (including peripherals such as gun models). The environment would be built up by strategically placed sensors, which wouldn't even have to be in the same room as the computer. Goggles or shade-type devices (similar to Google Glass) would then serve as the computer's output. Software would use the sensors to interpret the environment and map the output accordingly, roughly the sense-interpret-render loop sketched below.
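
To make that flow concrete, here is a minimal sketch of such a loop in Python. Every name in it (Sensor, SceneModel, HeadsetDisplay) is a made-up placeholder, not any real SDK; it only illustrates the data flow from the room sensors, through the hub, to the glasses.

import time
from dataclasses import dataclass, field


@dataclass
class SceneModel:
    """The hub's running model of the physical room."""
    surfaces: list = field(default_factory=list)          # walls, tables, etc.
    tracked_objects: list = field(default_factory=list)   # e.g. a gun-model peripheral


class Sensor:
    """Placeholder for a depth camera / tracker placed somewhere in the room."""
    def read(self):
        # A real sensor would return point clouds and tracked-marker poses.
        return {"points": [], "markers": []}


class HeadsetDisplay:
    """Placeholder for the goggles / glasses that show the output."""
    def draw(self, overlays):
        pass  # a real device would render the overlays anchored to the room


def interpret(frames, scene):
    """Fuse raw sensor frames into the scene model (heavily simplified)."""
    for frame in frames:
        scene.surfaces.extend(frame["points"])
        scene.tracked_objects.extend(frame["markers"])
    return scene


def render_overlays(scene):
    """Pick virtual elements to anchor onto the mapped surfaces."""
    return [{"anchor": s, "content": "virtual panel"} for s in scene.surfaces[:3]]


def main_loop(sensors, display, frames_to_run=600):
    scene = SceneModel()
    for _ in range(frames_to_run):
        frames = [s.read() for s in sensors]   # input from the room-mounted sensors
        scene = interpret(frames, scene)       # the hub interprets the environment
        display.draw(render_overlays(scene))   # output goes to the goggles/glasses
        time.sleep(1 / 60)                     # roughly a 60 Hz refresh


if __name__ == "__main__":
    main_loop([Sensor(), Sensor()], HeadsetDisplay())

The point is just the separation described above: the sensors and the display never talk to each other directly; the hub in the middle owns the scene model and does the mapping.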

 

The system would allow one to interact with elements on their room walls, or even in the middle of the room like a hologram. It wouldn't be a true "Iron Man" interactive computer system, though, since one would need the goggles/glasses to see it.

 

I haven't heard of a system like the one I just described surfacing just yet.

Guest RET.Maj.ShadowOp=US=
Posted
I think we should just be putting all the dollars into better neural interfacing. No need for expensive peripherals of any kind if I can just plug I/O into my brain.
Guest RhinoTech
Posted

Unfortunately, sir, I have no background in biotechnology. I don't even know how close we are to achieving something like that. It would be beyond awesome if I could download media onto an embedded biotech SSD and interface with it. Imagine having all that information in addition to Google's search algorithm! Studying would be so easy, and standardized tests would have to change to reflect practical skill use rather than knowledge retention.

 

Actually, it reminds me of a SyFy series called "Intelligence" where they do just that, except they took it one step further by connecting him to the internet. I stopped watching after a few episodes because the plot was getting lame.

Guest Oxygen5
Posted
Here I agree with the Captain: no need for all this equipment that requires a large room and a lot of money. But I also wouldn't want things to turn out like "The Matrix" =D
Guest MAJ.Kaossilator=US=
Posted

I think a lot of the benefit comes from what specific application you're looking at. There won't be one, universal, catch-all technology that accomplishes everything for the entertainment industry, the military, medical research, commerce, etc.

 

Lexus is using the Oculus to let potential buyers test-drive their new cars. Gamers already use the Oculus to achieve a more immersive environment. For those two applications, a more VR-driven, compact solution could be really good. You don't actually need to go anywhere to test-drive a car, and for the most part, when we're gaming we're in our own homes with limited space and need a compact but fun solution. Money is usually a variable, and with lower cost comes lower results. I doubt people will ever have their own holodeck in their homes, but a limited-immersion tech could easily work.

 

For the military, something that's completely immersive makes sense. That's why movies like Inception spin off of the whole military simulator idea for their plot arcs. It's a reasonable step. The more immersive, the better for training purposes. More money, more results. It's completely logical that the military would want to go more the route of tricking the brain into thinking they are literally somewhere else with a totally different environment around them even though they're lying on a bed in a lab. The training implications are enormous, and training is such a massive part of what the military deals with every day that it's in their best interest.

 

For medical and other research, mapping the galaxy, teaching, etc., a more holodeck-like approach makes total sense. If you want to see a model of something, or interact with a holographic representation of an environment, location, or object, then it's perfect. What better than practicing surgery on a holographic patient on a table in front of you? You don't need to trick your brain into being somewhere else; you can practice right there in front of you. In the classroom, if you could take a group of kids into a holodeck and show them what Jupiter looks like, or what Earth looks like from space, then a holodeck makes perfect sense.

 

We're sort of nipping around the edges of some really cool things. It's all pretty legitimate stuff, and the applications will always be there, just with a different focus. Different intent, different product, different results. Realistically, we're much closer to the holodeck than we are to The Matrix or Inception. But that doesn't mean it all works perfectly for the different applications.

Guest RET.Maj.ShadowOp=US=
Posted
There is actually some pretty neat stuff being developed for this, LCpl. We've already got working neural interfaces to control prostheses, computer mice, and other input devices. We're making progress on mapping people's mental images (dreams, thoughts, etc.). The only part that is difficult is the input into the brain. It's a lot easier to read what the brain is doing than to force sensory imagery onto it.

 

 

Also for those that haven't seen:

http://clanunknownsoldiers.com/barracks/showthread.php?25033-My-current-and-planned-VR-build&highlight=omni

 

Yeah I like my VR.

Archived

This topic is now archived and is closed to further replies.
