r/AR/VR/Future for Kotonog
11 subscribers
3.9K photos
2.19K videos
12K links
All hot topics about AR/VR/Future for Kotonog
Parsing from channels:
- r/augmentedreality/
- r/oculus/
- r/virtualreality/
- r/AR_MR_XR/
- r/Futurology/
As questionable as its depiction of VR is, I always thought the Ready Player One movie's idea of 'toy vehicle inventories' would be a neat VR mechanic.

https://redd.it/1rob1ev
@arkotonog
I tested XGIMI's MemoMind One and they prove smart glasses don't need a camera to be good
https://redd.it/1rol4w7
@arkotonog
In 2025, for the first time, solar and wind produced more electricity than fossil fuels in the European Union. The bloc's goal to reduce fossil fuel use by 90% by 2040 seems on track.

The 2026 Middle East War is likely to be the last in human history where a disruption to fossil fuels means a major global economic impact. By the 2030s, both China and Europe will be well on their way to totally decarbonising their economies, and Chinese manufacturing exports of renewable tech will be doing the same for much of the rest of the world. The age of fossil fuels will be disappearing in the rear-view mirror.

The longer the war goes on, the more renewables win. It will be clear they mean cheap, reliable, clean energy, and freedom from global instability. Tens of millions of people around the world buying cars in 2026 will be looking at EVs with new appreciation.

DATA/ARTICLE - In 2025, solar and wind produced more electricity than fossil fuels in the European Union

https://redd.it/1rp07cm
@arkotonog
What will seem like an inevitable outcome in 20 years' time because of GLP-1s

I'm kind of obsessed with the wide range of impacts GLP-1s are having on people's day-to-day lives and the wider impacts on the food system/social behaviours/family dynamics etc.

A few examples:
1. My friend has completely stopped drinking (even after coming off) and primarily socialises now through sauna/runs/hiking etc.
2. Another friend is very tired so has massively reduced their socialising and also their consumption of literally everything. She says she does a lot more chill hobbies at home on her own.
3. The often-quoted stat that GLP-1s are going to save airlines $580 million a year on fuel.

If we assume there will be mass uptake of GLP-1s: what do you think the inevitable societal impacts of this are? What non-obvious impacts do you think they will have?


One of my short-term thoughts is an increase in nutritional deficiencies that require treating, and therefore increased pressure on the food system to overhaul (here's hoping).

https://redd.it/1rqvadj
@arkotonog
We're going to look back at the current internet the way we look back at cigarette ads from the 1950s

Every app is designed to maximize time spent, not value delivered. Social media algorithms feed you content that makes you angry because anger drives engagement. Kids have unrestricted access to platforms that adults struggle to use responsibly.

In 20-30 years, I think we'll look at this era of unregulated attention harvesting the same way we look at doctors recommending cigarettes. The science was already there, the harm was already visible, but the money was too good for anyone to stop.

The only question is how much damage gets done before the correction happens.

https://redd.it/1rrlg33
@arkotonog
"The frame is the most comfortable and lightweight VR headset I’ve ever worn." - Ben Smith from indie.io
https://redd.it/1rrv6t5
@arkotonog
Q3 Is Still One of the Most Reliable VR Headsets in 2026
https://redd.it/1rysud5
@arkotonog
As a quarter of the globe's fossil fuel supply faces going offline for years, America is bringing the Fossil Fuel Age to a crashing end.

In the summer of 1914, enthusiasts for war were sure they'd be home by Christmas. The ancient ruling dynasties of Romanovs, Hohenzollerns, and the 700-year-old Habsburgs felt their thrones were safe. Five years later, that was all proved very wrong.

Five years is about the time span it may take to get the Persian Gulf's oil & LNG back online if it is destroyed. Iran & Israel/US are quickly nearing the point on the escalation ladder where that may happen. “History doesn't repeat itself, but it often rhymes,” as the famous aphorism goes. This time, the casualty may be fossil fuels themselves.

As this war progresses, the world may soon find itself in a far bigger emergency situation than COVID. Rationing and economic chaos lie ahead. Like COVID, governments will scramble for alternatives and responses.

But something is different this time. There is an alternative. It's a world dominated by renewables and electrification - not fossil fuels. We were already transitioning to it anyway. Now, war may force people's hands and make this future happen far quicker. In 1918, no one wanted the old world back; they wanted a new one. We may find the same is true for fossil fuels when the latest ME war is finally done.

https://redd.it/1rz6o4i
@arkotonog
VRPerfKit: Use NIS at 200% (renderScale 2.0) to IMPROVE image quality — not just performance. Works on SteamVR, Oculus AND OpenXR games (MSFS, etc.)

I've been using VRPerfKit in a way I've never seen documented anywhere, and I want to share it because the quality improvement is significant with almost no performance cost.

The short version: set renderScale to 2.0, method to NIS, and sharpness to 0. That's it.

# What this actually does

Most guides assume VRPerfKit is only for reducing render resolution to gain performance. But you can do the exact opposite: keep your game's native render resolution and let NIS reconstruct a 2x image (4x the pixels) on top of it, applied only to the central 60% radius where your eye actually focuses.

The key insight is that integer scaling (exactly 2x = 200%) is mathematically optimal for spatial upscalers. Every source pixel maps cleanly to a 2×2 block of destination pixels. There are no fractional phase mismatches, no ambiguous sample points. NIS's 6-tap adaptive kernel has full, symmetric information for every single output pixel. This is the one case where a spatial filter works at its absolute best.
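The "clean 2×2 mapping" claim can be made concrete with a few lines of arithmetic. This is an illustrative sketch with hypothetical helper names, not VRPerfKit code: at exactly 2x, the destination block a source pixel feeds, and the inverse lookup, are both plain integer arithmetic with no fractional remainder anywhere.

```python
def owner_block(sx, sy):
    """Destination 2x2 block fed by source pixel (sx, sy) at exact 2x scale."""
    return [(2 * sx + dx, 2 * sy + dy) for dy in (0, 1) for dx in (0, 1)]

def source_of(x, y):
    """Inverse mapping: the source pixel whose block contains destination (x, y)."""
    return (x // 2, y // 2)

block = owner_block(3, 5)
print(block)  # [(6, 10), (7, 10), (6, 11), (7, 11)]

# Every destination pixel in the block maps back to exactly one source pixel.
assert all(source_of(x, y) == (3, 5) for x, y in block)
```

At any fractional scale, `source_of` would involve a division with a remainder that varies from pixel to pixel, which is exactly the phase-mismatch problem described above.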

# Why sharpness must be 0

At fractional scales (0.77x, 0.83x, etc.), sharpening is added to compensate for blur and reconstruction artifacts. At 2.0x integer scale, there is no information loss to compensate for. Adding sharpness would only introduce noise and halo artifacts on top of an already clean reconstruction. Set it to zero.

# Recommended config

upscaling:
  enabled: true
  method: nis        # NIS — best choice, explained below
  renderScale: 2.0   # integer 2x scale — the key
  sharpness: 0.0     # zero — no compensation needed
  radius: 0.6        # central 60% — where your eye focuses

# OpenXR compatibility (MSFS and other non-SteamVR games)

The official VRPerfKit documentation says it targets SteamVR and Oculus runtimes. However, the upscaling component (NIS/FSR/CAS) also works with OpenXR games — the fixed foveated rendering part does not. This means it works fine with:

- Microsoft Flight Simulator (OpenXR)
- Half-Life: Alyx (SteamVR)
- Asgard's Wrath (Oculus)
- The vast majority of PC VR titles

I use it combined with DLSS 4.5 transformer in MSFS and it still adds a noticeable layer of quality on top.

# Why NIS and not FSR 1 or CAS?

NIS vs FSR 1: Both are spatial upscalers, but NIS uses a wider 6-tap kernel with 4 directional edge filters (horizontal, vertical, ±45° diagonal). FSR 1 uses a modified Lanczos-2 with only a 2×2 source texel window. At 2.0x scale, NIS has significantly more contextual information per output pixel. NIS wins on reconstruction quality.
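For intuition on those tap counts, here is a small sketch, not the shaders' actual code: kernel radius 2 is the textbook (unmodified) Lanczos-2 support, and radius 3 for NIS is inferred from the 6-tap description above. Counting the integer source offsets inside each kernel's support shows how much context each filter sees per axis.

```python
import math

def lanczos2(x):
    """Textbook Lanczos-2 window: sinc(x) * sinc(x/2) on |x| < 2, zero outside."""
    if x == 0.0:
        return 1.0
    if abs(x) >= 2.0:
        return 0.0
    px = math.pi * x
    return 2.0 * math.sin(px) * math.sin(px / 2.0) / (px * px)

def taps(phase, radius):
    """Integer source offsets inside the kernel support for a sample at `phase`."""
    return [k for k in range(math.ceil(phase - radius), math.floor(phase + radius) + 1)
            if abs(k - phase) < radius]

print(len(taps(0.25, 2)))  # Lanczos-2 support: 4 taps per axis
print(len(taps(0.25, 3)))  # a 6-tap kernel like NIS's: 6 taps per axis

# Weights a plain Lanczos-2 would assign to its 4 taps at this phase
print([round(lanczos2(k - 0.25), 3) for k in taps(0.25, 2)])
```

FSR 1's modified Lanczos-2 restricts this further (per the post, to a 2×2 window), so the gap in per-pixel context versus a 6-tap kernel is even larger than the textbook comparison suggests.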

NIS vs CAS: CAS is primarily a sharpening filter, not a reconstruction upscaler. It adjusts per-pixel contrast to recover sharpness lost by TAA or post-processing. It can do minor upscaling as a secondary function, but at 2.0x it has no proper reconstruction kernel. It is not the right tool for this use case.

NIS is the correct choice: better edge preservation, more reconstruction context, designed for quality upscaling.

# What about DLSS?

DLSS 4 (especially the transformer model) is in a completely different league — temporal reconstruction with motion vectors, multi-frame context, AI-driven detail recovery. If your game supports it natively, use it. But the vast majority of VR games do not have native DLSS support, and DLSS cannot be injected externally (it requires the game engine's motion vectors and depth buffer). NIS at 200% is the best universal alternative available today.

If you care about VR image quality, lobby developers to implement DLSS natively in their VR games. The difference in games that have it (MSFS, No Man's Sky, Kayak VR) is substantial.

# Quick summary

- renderScale 2.0 = integer scale, no artifacts, no blur
- sharpness 0.0 = correct, no noise amplification
- NIS = best spatial filter for this use case
- Works on SteamVR, Oculus, AND OpenXR
- Negligible performance cost
- Universal: works on any VR game regardless of its AA method

Happy to answer questions. I'm curious whether anyone else has tried this — I've found zero documentation of this approach anywhere online.


---

https://preview.redd.it/ixvqcuovrcqg1.png?width=1410&format=png&auto=webp&s=5bff30572330ec36e84477fc5a0930cb8f37041a

I want to clarify exactly what's happening at the pixel level, because I think the confusion is about what renderScale 2.0 actually does.

Concrete example with real numbers:

Say the game is rendering at 2000×2000 per eye (whatever your SteamVR resolution happens to be — I use Virtual Desktop at medium). With renderScale 2.0, the game still renders at 2000×2000. NIS then takes that frame and outputs 4000×4000 — but only within the central 60% radius. The game's render resolution is untouched. You are not sacrificing any rendering quality.

Why integer 2× is fundamentally different from fractional scales like 77%:

At 77%, the game renders at ~1540×1540 and NIS upscales to 2000×2000. The problem is that each output pixel sits at a fractional position relative to the source pixels — there's no clean 1:1 mapping. Every output pixel has a slightly different phase offset, so the 6-tap reconstruction kernel can't be symmetric for all of them. The result is a mix of mild blur and per-pixel inconsistency that shows up as instability when you move your head. People compensate with sharpening, which recovers some perceived sharpness but adds noise on top.

At 2.0×, every single source pixel maps to an exact 2×2 block of destination pixels. The phase is identical for every output pixel. The 6-tap kernel has full, symmetric context. NIS is operating in its mathematically ideal condition — the one where its edge-adaptive directional filters (it runs 4 directional passes: horizontal, vertical, ±45°) can actually do what they're designed to do without ambiguity.
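The phase claim is easy to check numerically. A minimal sketch, assuming the conventional pixel-centre resampling convention src_x = (dst_x + 0.5) * src/dst - 0.5 (VRPerfKit's exact convention may differ):

```python
def phases(src, dst, n=8):
    """Fractional sampling phase of the first n output pixels,
    using the pixel-centre mapping src_x = (dst_x + 0.5) * src/dst - 0.5."""
    return [round(((x + 0.5) * src / dst - 0.5) % 1.0, 4) for x in range(n)]

print(phases(2000, 4000))  # integer 2x: only two symmetric phases, 0.75 and 0.25
print(phases(1540, 2000))  # fractional ~77%: a different phase for every pixel
```

At 2x the kernel sees the same two mirror-image sample geometries repeated across the whole frame; at 77% every output pixel gets its own geometry, which is the per-pixel inconsistency described above.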

What the reconstruction actually produces:

Going from 2000×2000 to 4000×4000 via NIS is not just "making pixels bigger." The kernel is analyzing gradients in the source image and inferring where edges lie at sub-pixel precision. The 6 neighboring samples per axis give it enough information to reconstruct diagonal edges, curves, and fine texture in a way that bilinear or bicubic cannot. The perceived sharpness gain is not artificial — it comes from real edge reconstruction information that exists in the source data but was never being used by the display. Setting sharpness to 0 is correct here: there's no blur to compensate for, so you don't want the sharpening pass adding noise to an already-clean signal.

The 60% radius constraint means this processing covers exactly the foveal region — where your eye actually has full angular resolution. The lens distortion makes the periphery blurry anyway, so spending GPU time reconstructing it would be wasted.

This is a completely different operation from the standard use case. NIS at 77% is trying to recover quality lost by intentional downsampling. NIS at 200% is adding a reconstruction layer on top of a full-quality render. The direction is opposite, and the math is optimal in one case and compromised in the other.

Sharpness 0, renderScale 2.0, method NIS. It's not placebo — disable it and re-enable it and the difference is immediate.

https://redd.it/1rz6xki
@arkotonog
Meta bricked my Quest 3 with an update, told me "all good things must come to an end", then added a new headset to my shopping cart

My Quest 3 was recently bricked by a Horizon OS update. I contacted Meta support, explained the issue, cited the Consumer Rights Act, and asked for a replacement since their own software update killed my device.

After their first-line support stonewalled me and initially refused to direct me to their complaints process, I finally sent a full account of what had happened and received one of the most baffling emails I've ever read:

>You can not possibly imagine that we will keep replacing the same device over and over despite what the issue with the device is, weather it was software or hardware related.

>As you mentioned your device has served you well for over 26 month which is actually beyond it's standard warranty, as you know all good things must come to an end I'm afraid.

>Please note that opening multiple tickets for the same concern will not change the outcome, we will not be able to replace your headset. I think it is time to have an upgrade.

>I've added the Meta Quest 3 512GB to your bag.

They have never replaced my device before. I have never opened a complaint with Meta Support before. The Quest 3 they added to my bag wasn't even a refurb.

https://redd.it/1rzz761
@arkotonog