Windows 10 has a video capture application built in. It's designed for capturing output from games, but it also works fine with YouTube. You can access it using Win+G. The video gets saved in your C:\Users\YOURNAME\Videos directory. Super convenient! As always, Google is your friend.
https://gizmodo.com/windows-10-is-hiding-a-great-video-capture-tool-1719196149
I finally took the plunge and picked up a PC-based VR system. Specifically, I got the HP Mixed Reality headset. I had some fun setting it up, but overall I'm impressed. The headset is fairly comfortable, and the cable is certainly lighter and less noticeable than on other headsets I've used. It works well in my dark apartment without the need for tracking stations. The tracking range on the hand controllers is fine; I've not noticed them lose tracking yet. The MR home/application launcher is reminiscent of PlayStation Home. You can access the store as well as a web browser, Skype, and video playback from screens that can be deployed around the location. And yes, you can stand on the countertops (just like PlayStation Home). So apart from the shenanigans detailed below, I'm impressed so far. Now to make some content! :-)

More Cables Needed
The headset is supplied with a cable that ends in HDMI and USB 3.0 plugs in a Y configuration. Each branch of the Y is about 8 inches long; the expectation is that your PC will have a USB 3.0 SS port and an HDMI port close together. This may be true for some laptops, but on a desktop the fast SS ports are often on the front where they can be used to connect external drives, cameras, etc. I needed a USB 3.0 extension cable to route the cables in a configuration that works. Using a standard USB extension cable or a Targus USB 3.0 hub did not work: the headset would try to start and fail repeatedly, although tracking was working. You can get the correct cable anywhere, just make sure it's USB 3.0. I got mine from Amazon Basics for 7 bucks. If you have a laptop, you may need a DisplayPort to HDMI adapter as well. There is a list of supported adapters on Microsoft's website.

Controller Pairing
There is a plastic tab inside the controller battery compartments that is used to pair the device. This is a bit fragile. I managed to permanently bend one of mine into the down position, which meant the button underneath was depressed. After ruining a nice pair of tweezers trying to fix this, I broke the plastic tab off with a screwdriver and was then able to pair the controller. It is worth noting that the controllers need AA batteries and none are supplied. I'm using rechargeable ones with no problems.

WiFi vs. Bluetooth
The last issue I'm having is the Bluetooth controllers swamping my PC's WiFi connection. Apparently this happens on some laptops, but it also happens on my aging desktop. The solution seems to be a dedicated Bluetooth dongle that doesn't share an antenna with the WiFi. Apparently switching to 5GHz WiFi can also solve the problem, but neither of my access points supports that. Sheeeshhhh. My short-term solution is a wired internet connection, but I guess I'll get a dedicated BT dongle.
"Your notebook may share its WiFi antenna with Bluetooth when connected to a 2.4GHz access point. Check in Device Manager whether you can switch the band preference to 5GHz. If a 5GHz network is not available and performance is severely impacted, consider using a Bluetooth dongle."

I wanted to experiment with location-based augmented reality, and I wanted to do it in the easiest possible way. Metaverse from gometa.io is a service that makes it easy to create AR applications. Content authoring is done through a web portal, and the AR content is consumed through Android and iOS apps. I've always been fascinated by the early history of San Francisco, and especially the large number of ships buried under the financial district in what was Yerba Buena Cove. I decided to combine the two and build an AR experience that allows the user to find the locations of these buried ships. Fortunately, there is plenty of information available on the web.

Building It
Metaverse has the concept of Quests. A Quest is composed of a set of Experiences. Each Experience is tied to a physical location and is made of a number of Scenes. The system has some rich functionality allowing for scavenger hunts and the like, but all I needed was a way to hold a set of geo-located information panels. There are a variety of Scene types, the most basic being an image and some text, but you can also play video, go to web pages, and give and receive items for gameplay. I decided that each of my Experiences would have two Scenes: one showing the name of the ship along with an image, and one giving more information via a simple webpage. I authored and hosted the webpages using Weebly. To make testing easier, I tried the app out with placeholder scenes in my neighborhood before deploying the content to SF. In the above image you can see the list of Experiences in the Quest on the left, with the Rome Experience currently being edited. The first Scene appears when the user geo-locates the target. When they tap the button at the bottom of the page, they are taken to the second Scene, which displays the mini webpage. The Experience then ends but stays visible on the map; it's also possible to have the Experience disappear or trigger another event.

What the User Sees

In Summary
Creating the Ships Buried Under San Francisco quest took me about a day and a half. I then spent a very pleasant afternoon in the City testing the app and finding the ships. Apps like this definitely show the potential for guided/curated exploration experiences as well as games, scavenger hunts, and promotional activities. I had a lot of fun making the app, and I think anyone with minimal computer skills could do the same. Within a large tourist city like San Francisco, the applications are almost limitless. How about a tour of the city's best street art, or a historic trip through the Summer of Love featuring the music and psychedelic rock posters of 1967?
Amazing video showing how the level creation was done for this game using Houdini.
I'd been meaning to watch this video on using photogrammetry to create the landscapes and levels in Battlefront for ages. I finally made the time, and it's really worth it. Amazing work. Lars and the guys from Capturing Reality gave a great talk this year at GDC. I want to go do some photogrammetry now :-)
I made a mod for NVIDIA's VR Funhouse game that adds giant spiders. Apparently I'm an Unreal Engine developer now.
Needed some horror music for the Spider Attack. Cranked this out in Acid Studio in about 10 minutes. I guess downloading all those 8packs over the years was worth it after all. Here is a screen shot of the Acid loop creation tool I used to make the track.
Learning Unreal Engine. Attaching blueprints to the spiders to control their behavior. This will be for a VR minigame: you are trapped in a ruined apartment complex being attacked by mutant spiders. So, a typical Thursday night, really.
I'm heading to an AR/VR meetup at Google on Monday and I thought I should bring a demo. I dug out my Tango, updated the firmware, found a working USB cable (the hardest part of any Android project), and created an AR Unity application where goldfish shoal around you.
Building the app was very straightforward. In typical Unity style, I was mainly dragging and dropping prefabs. Google's documentation is up to date and very helpful. Because the Tango is a bit more aware of its environment than a regular smartphone, the fish feel more planted in the environment. You can move the device from side to side and back and forth, and the fish still appear in sync with the camera feed. It's pretty cool. There are lots of things that could be improved, like matching the lighting on the fish to the information from the camera, as well as occlusion mapping. Originally I was going to put the Tango in a VR headset so this could be experienced that way, but I think for the meetup a handheld experience will be more fun and social. Hmmm... I wonder what the market for augmented fish tanks is like? That Pokemon thing sure seems popular :-0

There are a couple of wrinkles to rendering lines in Unity, depending on what you are trying to do. Line rendering is achieved with the LineRenderer component. A GameObject can only have a single LineRenderer attached, so you need to create a new GameObject for each LineRenderer. If you need to dynamically create and destroy lines, you'll need to keep track of them in a list.
```csharp
// have to add a new game object per line renderer
LineRenderer lineRenderer = new GameObject().AddComponent<LineRenderer>();
```

The other issue is that if your lines use a shader that isn't referenced anywhere else in your scene, the Unity Editor won't include it in your package, and the lines won't render in a built project although they will work in the editor.

```csharp
// force the shader to be copied into the resource folder using
// Edit->ProjectSettings->Graphics->Always Included Shaders
Material whiteDiffuseMat = new Material(Shader.Find("Unlit/Color"));
lineRenderer.material = whiteDiffuseMat;
```

Not completely horrible, but still, hoops to be jumped through...

```csharp
using UnityEngine;
using System.Collections;

public class basicRender : MonoBehaviour
{
    void RenderCircle(Vector3 pos, float rad, Color col)
    {
        float angle = 0.0f;
        int segments = 128;

        // have to add a new game object per line renderer
        LineRenderer lineRenderer = new GameObject().AddComponent<LineRenderer>();

        // force the shader to be copied into the resource folder using
        // Edit->ProjectSettings->Graphics->Always Included Shaders
        Material whiteDiffuseMat = new Material(Shader.Find("Unlit/Color"));
        lineRenderer.material = whiteDiffuseMat;

        lineRenderer.SetColors(new Color(1.0f, 1.0f, 1.0f, 1.0f),
                               new Color(1.0f, 1.0f, 1.0f, 1.0f));
        lineRenderer.SetWidth(0.02F, 0.02F);
        lineRenderer.SetVertexCount(1 + segments);

        for (int i = 0; i < (segments + 1); i++)
        {
            Vector3 p = new Vector3();
            p.x = pos.x + (Mathf.Sin(Mathf.Deg2Rad * angle) * rad);
            p.y = pos.y + (Mathf.Cos(Mathf.Deg2Rad * angle) * rad);
            p.z = pos.z;
            lineRenderer.SetPosition(i, p);
            angle += (360.0f / segments);
        }
    }

    // Use this for initialization
    void Start()
    {
        float r = 0.6f;
        for (int i = 0; i < 20; i++)
        {
            RenderCircle(new Vector3(0.0f, 0.0f, 0.0f), r,
                         new Color(1.0f, 1.0f, 1.0f, 1.0f));
            r += 0.6f;
        }
    }

    // Update is called once per frame
    void Update()
    {
        if (Input.GetKeyDown(KeyCode.Escape))
            Application.Quit();
    }

    private void OnDestroy()
    {
        Destroy(this.GetComponent<Renderer>().material);
    }
}
```
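The create-and-destroy bookkeeping mentioned above can be sketched like this. This is a minimal, untested outline assuming a Unity project; the `LineManager` class and its method names are my own invention for illustration, not part of Unity's API:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical helper that tracks one GameObject per LineRenderer,
// so dynamically created lines can be cleaned up later.
public class LineManager : MonoBehaviour
{
    private readonly List<GameObject> lines = new List<GameObject>();

    public LineRenderer CreateLine()
    {
        // Each LineRenderer needs its own GameObject.
        GameObject go = new GameObject("Line");
        lines.Add(go);
        return go.AddComponent<LineRenderer>();
    }

    public void ClearLines()
    {
        // Destroy the owning GameObjects, not just the components,
        // or the empty objects will accumulate in the scene.
        foreach (GameObject go in lines)
            Destroy(go);
        lines.Clear();
    }
}
```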
David Coombes
Making stuff