Sunday, September 19, 2021

Playing Snailiad on Debian Linux in 2021

Flash is dead. It's still a bummer that Adobe never released the source code for public maintenance.
There are still some Flash games around that should be preserved for future generations. Snailiad is one of them.

I joined the official Snailiad Discord server to check for the currently accepted way of playing it. It turns out that newer versions of Flash play Snailiad at the wrong speed, and the recommended approach is to use Flash 11.3 for it.

The old Flash binaries can still be grabbed online.
Disclaimer: Download this program at your own risk. It's an old preserved Linux binary and I can't assure its authenticity!
Please check the md5sum before installing: 11b83aafdd4de9b64590b2ed43c6cb08
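A small helper for that check might look like this (the archive file name below is just a placeholder for whatever you downloaded):

```shell
# Check a file against an expected MD5 sum and report the result.
check_md5() {
    file="$1"; expected="$2"
    actual="$(md5sum "$file" | cut -d' ' -f1)"
    [ "$actual" = "$expected" ] && echo "checksum OK" || echo "checksum MISMATCH - do not run this!"
}

# For the downloaded player (adjust the placeholder file name):
# check_md5 flashplayer_11.3.tar.gz 11b83aafdd4de9b64590b2ed43c6cb08
```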

As this is a 32-bit application, some libraries need to be installed in their 32-bit versions.
sudo apt-get install libxt6:i386 libgtk2.0-0:i386 libasound2-plugins:i386
Then you should be fine with:
./flashplayer Snailiad.swf
Happy Snailing around!

Tuesday, March 16, 2021

100% disk usage on SSD and system freezes on Windows 10

I'm no ordinary Windows user. I boot the system, play some Doom Eternal, play some VR, and then I reboot to Linux.
When I bought my current PC last year in March, I didn't really notice it at first, but every time I installed something, my computer froze. Not even Discord reacted to keystrokes. It was weird and only occurred under Windows.

At the moment I'm booting Windows more often, as I'm playing around with VR development and my headset is not supported on Linux. (Not many are, anyway...)

And it kept getting on my nerves, so I investigated with procexp64.exe and the Task Manager. The disk usage of my system was pretty much always at 100% during the freezes. But why is the I/O that high if no actual data is requested?


Then I finally found the solution here. It turned out that the Microsoft driver StorAHCI.sys has an issue with some SATA controllers if MSI (Message Signaled Interrupts) is enabled. It can be disabled with regedit.
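For reference, the widely circulated fix disables MSI for the AHCI controller under the following registry key. The device instance path differs per machine, so the `<AHCI Controller>` part below is a placeholder; treat this as a sketch and back up the registry before changing anything:

```
HKEY_LOCAL_MACHINE\System\CurrentControlSet\Enum\PCI\<AHCI Controller>\Device Parameters\Interrupt Management\MessageSignaledInterruptProperties
    MSISupported (DWORD) = 0
```

A reboot is required afterwards for the change to take effect.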

If you suffer from the same issue, maybe the provided link will also help you. I'm writing these lines to increase the odds that the next stranger finds this faster than I did.

Thursday, January 28, 2021

How to play Half-Life: Alyx with an HP Reverb G2 and an RTX 2070 Super - My story

Why all this?

It feels kinda awkward to write this. I mean... duh... you just buy a VR headset, buy the game and play it. It runs on Windows, so how difficult can it be?

I had no idea and this is my story.

Until 2020 I didn't have a lot of experience with VR. Back when the Oculus Rift DK2 was available, I borrowed it from a friend and played some Minecrift (the spelling is correct) with it. It was a mod to make the Java Edition usable with this particular VR headset. It was the early age, when motion controls were not yet available. It was a horrible experience. I felt sick afterwards, and the view inside was like looking through a screen door, as the pixels had some distance in between. I also played a Half-Life 2 mod, but it looked worse than Minecraft, as all textures were washed out. The resolution of the headset was just not high enough.

Then it went quiet for me concerning VR. Someday a friend showed me his PSVR for his PS4. I didn't like it either. The resolution was again not high enough. The games were boring and uninspired. At least that's what I thought at the time.

In August 2020 I visited a friend during a vacation and he showed me his Oculus Quest. Time had passed, and so had the technology. Motion controls are now the norm, and more creative games were available. For the first time I played Beat Saber, and I loved it. We also played "Keep Talking and Nobody Explodes" together, and as we both are quite retro when it comes to gaming, he also showed me the Virtual Boy emulator VirtualBoyGo. It was a nice experience and I wanted more. His brother owned the Valve Index and was kind enough to let me compare it to the Quest. I could play Beat Saber on it as well. At the time I was overwhelmed, so I didn't notice much of a resolution difference. Everything was new: Inside-out tracking? Lighthouses? Knuckles with finger tracking? Guardians?

That day I knew that I wanted this as well. VR had matured into a state where I believed I could now have fun with it. And so it started.

What Headset to get?

By the time I made the decision to get into VR, there were quite some options for headsets to buy. I wanted to go for PC VR, as I had just bought a new PC and wanted to have the best graphics for my VR experience. I also really wanted to play Half-Life: Alyx, which was already known as one of the first AAA VR games. So it was either the Valve Index or the just announced HP Reverb G2, which was advertised as having a higher resolution and therefore being less prone to the so-called screen door effect. Some YouTubers had already gotten hold of early samples and made through-the-lens videos to show the resolution differences between the Oculus Rift S, Valve Index and HP Reverb G2. I was hooked on the G2, even though this meant that I had to wait some weeks until I could get my hands on one. But that was OK. No need to rush. I had already missed like four years of VR, and some weeks wouldn't hurt.

Waiting times for the Headset to arrive...

November came, no G2.... issues with the shipment.

December came, no G2.... a little bit of frustration.

On December 21st, 2020, just before Christmas, the package arrived. So, by accident, I had made myself a present for the holiday season. Luckily, one day later I even got my lenses from VROptiker. I was ready!

I even started Windows for the first time in half a year; it had last been booted to play Doom Eternal when it came out.
Yes, I'm a Linux user....

So, I plugged it in, and after some Windows updates and the installation routines of the Windows Mixed Reality Portal, I got this:

Error Code 7-14, indicating a USB 3.0 issue

Huh. What's this? Oh my god. Please don't tell me that the headset is broken. That would be awful.

I searched for this 7-14 error code on the internet, and it turned out that a lot of people have this issue. It was related to bad USB 3.0 ports on AM4-socket AMD boards. With my B450, I was part of that party. What could I do? Would my VR experience end here? Just days before Christmas?

At that time I stumbled onto quite some websites. I found the German website vr-legion, which gave some tips and tricks about this issue. I gave it a shot and bought the USB 3.0 Vivanco USB Hub 36663 at a local electronics store. It was Corona time, but I was lucky that I could order online and pick it up at the store.

Back home I was full of hope. I connected the USB hub and there we go: the headset worked, and I was standing in the Cliff House of Windows Mixed Reality. I was even able to play some Beat Saber.

But something was off. The tracking stuttered, and sometimes the audio dropped out and came back. I could hear from my monitor that the USB plug-in and plug-out sounds were played. Sometimes the view in my headset stopped and locked me in place, even though I was moving in reality. Sometimes the view rotated 180° and I could see as if I had eyes in the back of my head. It was an awful experience.

I then ordered the USB 3.0 PCIe card from Inateck that vr-legion suggested. But it had the same issue as the hub. A friend and neighbor was so kind as to buy me a second-hand Inateck card one day before Christmas so I wouldn't be so sad about it. It was an older Inateck card which I couldn't find online any more. But it had the same issues. Both Inateck cards had in common that they used Fresco chipsets for their USB 3.0 implementation. Those don't seem to be very reliable for VR.

Was the headset really broken then? I searched online and found the German YouTuber Zitronenarzt VR, who had similar issues as well. After Christmas was over, I didn't really know what to do. My neighbor suggested getting a PCIe USB 3.0 card from CSL, which is supposed to use an NEC chipset. This chipset was one of the two which were also suggested by Zitronenarzt. I got it, again second hand, from a former Oculus Rift player, a headset which shares these USB issues as well. And? It worked! The audio dropouts were gone, and the full tracking losses were a thing of the past as well.

One note though: it has a Renesas chipset, not NEC. But it works just as well.

CSL PCIe USB 3.0 Card with a Renesas Chipset, which was more reliable

Tracking Stutter issues, glitches in the flashlight

So what to do now? Playing some Beat Saber was the best option. And also buying and downloading Half-Life: Alyx. During that time, the Christmas Steam sale was still active, and I also went for Superhot VR and Moss, so I had some VR titles to play.

Beat Saber worked quite well. Quite some fast-paced action. I played the campaign to get into it. In parallel, I also started to play Half-Life: Alyx (from now on shortened to HL:A), as I really loved Half-Life 2 and thought this must be the best experience to get with current VR technology.

The game itself worked, kinda sorta. But something was still off. I could see it in the Cliff House. I could see it in the menu of Beat Saber. And I could see it in HL:A all the time while standing still. My head tracking had stutter issues. It was very subtle, but it was always there. It was almost like the tracking was one frame behind my real movement. It's difficult to describe.

I searched online about this but couldn't find answers. I contacted the HP customer support, but they weren't able to reproduce the issue. At that time I also noticed that the image shown by my flashlights (Microsoft's name for looking through the headset's inside-out tracking cameras) had some pixel issues now and then. I really thought that this was the culprit behind the stutter and told the support so. But they never answered that one question.

At that time I thought that the situation improved when pressing the headset against my head. But boy, was I wrong. That only reduced the symptoms.

Windows 20H2

As you might know by now, I'm a full-blooded Linux user. I only boot Windows when I really need it. My Windows 10 installation was still on version 1909. I always wondered why the settings menu of the Mixed Reality Portal looked different in pictures I found online. I also didn't know that the update to 20H2 had to be manually triggered. Previously I just grabbed the normal updates and thought that was enough. And so I updated to 20H2.

Did it fix the issue? I don't know anymore. Maybe it did and you should update as well.

The Nvidia driver 446.14

I don't know how I got this information in the first place, but I noticed an ongoing discussion about Nvidia drivers and VR compatibility. It was on various forums, Reddit and also the official Nvidia support forum. With certain Nvidia drivers, frame drops seem to occur more often in VR.

As a user of an RTX 2070 I was still able to downgrade. And so I did, and it improved my experience. The Beat Saber menu no longer stuttered, and while standing still I seemed to have no issues any more.

I played for a few days and enjoyed HL:A while still being in contact with the HP support.

Half-Life: Alyx - Chapter 5: Northern Star

The first chapters were a blast to play. I liked the concept of having an option for how I would like to move in the game. As a VR newcomer I went for the "blink teleport": I point somewhere, and by letting go of the left stick, I teleport to that location. It is the same concept that is also used in the Mixed Reality Cliff House and SteamVR Home, as it reduces motion sickness. Everything was fine until chapter 5 of the game, and then it went downhill again.

Every time I used the teleport, the game dropped 8 frames. Extreme motion stutter was the result. So I contacted the Steam support about this issue. Sadly they couldn't help me, as this issue seemed to be unknown to them.

The Nvidia hotfix driver 461.33

At this point I gave up on HL:A for a moment and started to play Superhot VR. The game worked without any flaws (apart from the stuttery menu room) and again was a blast to play.
I also kept an eye on the Nvidia support forum regarding the stutter issue, and suddenly it happened: on January 20th, 2021 a hotfix was released. It didn't really take off for me at first. Beat Saber started to stutter, and HL:A stuttered extremely as well. But the 8-frame drops were gone, replaced by exactly one, which was then filtered out by the motion reprojection of Windows Mixed Reality.

Tweaking Half-Life: Alyx Launch Options

I experimented a bit and also searched further for information about performance issues with HL:A. One of the sources which helped me with this issue was this website. It helped me understand the dynamic scaling system of HL:A, which was very important for the next part of the fix. I can also recommend reading this article about the different fidelity levels and the debug mode for better investigation.

With this knowledge I applied these launch options:

-console -vconsole -novid -nowindow +vr_perf_hud 1

-console -vconsole allows opening the console by pressing ~ on an English keyboard layout. Might come in handy. (It doesn't work with a German keyboard layout.)

-novid removes the Valve intro. For faster testing of settings.

-nowindow removes the spectator mirror window and increases performance.

+vr_perf_hud 1 activates a visualization of the dynamic scaling headroom analysis. (Say that three times fast.)

You can read more about this visualization here. While doing so, I noticed that my HL:A instance only had 4 levels of detail to choose from, while usually there should be 8. I was confused. But nonetheless I found a kinda sorta working solution.

I decreased the application-specific supersampling to about 50%, which is then multiplied by the 76% of my automatic supersampling setting in SteamVR, and there I had a kinda sorta working solution. But the game didn't really look that sharp anymore. So something was still wrong. In the game itself you are able to configure the visual quality: Low, Medium, High, Ultra. These settings didn't have any effect on my performance though.
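The two sliders multiply, so with the values above the game effectively renders at roughly 38% of the reference resolution scale. A quick sanity check:

```shell
# Effective render scale = SteamVR global SS x per-application SS.
# Values taken from the text: 76% global, 50% for HL:A.
awk 'BEGIN { printf "effective scale: %.0f%%\n", 0.76 * 0.50 * 100 }'
# prints: effective scale: 38%
```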

The last Fix - Don't use Vulkan

I continued to play the game and went for the later chapters. It didn't look really sharp, but I still had my fun with it. And a few hours away from the end, it kinda hit me.... Vulkan.... DirectX 11.....

I remembered that at the start I went for Vulkan instead of DX11, which is the default for HL:A. I switched back to DX11 and.... wow. The 4 levels of detail turned into 8 again, and the game looked much sharper. I continued to play to the end of the game and really had a good time with my settings.

I only felt a little bit sad, as it would have been better if I had known all this from the start. In the end I resolved my issues and can hopefully finally enjoy my VR experiences.

TLDR or Conclusion

So you have performance issues with Half-Life: Alyx? Try the following:

  • Apply the launch options -console -vconsole -novid -nowindow
  • Reduce the application-specific supersampling. Keep in mind that Half-Life: Alyx is capable of a scaling factor bigger than 100% when possible. The Steam supersampling value is only a reference for the game; it seems to merely define the internal 100%.
  • Try Vulkan and DX11 and compare both performance-wise.
  • Try the new hotfix driver from Nvidia, version 461.33
  • Deactivate any GPU monitoring (exception: the SteamVR performance graph)
  • Deactivate Stopsign VR, fpsVR or any Steam overlay
  • Decrease the Steam overlay rendering quality to Low (if you really need overlays)
    • You won't see a difference, and it performs better.
  • Optional: Deactivate Windows Mixed Reality motion reprojection for HL:A
    • Set it from "Auto" to "Per Application"
    • Disable Motion Smoothing in the video settings for the application.
  • Launch the game with +vr_perf_hud 1 and observe the selected level of detail.
    • Alter the application-specific supersampling according to the headroom.
  • Check the SteamVR performance graph
    • Pink spikes must be avoided at all costs, as they are frame drops and cause stutter.
    • Orange should not occur often. It's more acceptable if motion reprojection / smoothing is enabled.
Something I can't confirm, but maybe it works for you:
  • Deactivate RGB tools.
  • Discord can be a problem (I don't see this myself).


Sunday, September 20, 2020

The irrational dream of having CRT monitors in 2020

Backstory – In March 2020, Doom Eternal was released to the public.

So what does that have to do with CRT monitors? Let's find out.


Back in 2017, I played Doom 2016 and was hooked. It was a great game and felt almost like some sort of UT2004 single player experience because of the fast action and pacing.

When id Software announced a follow-up, I was hyped again and looking forward to the release.

During March, it was more and more apparent that my current setup with a GTX 960 and an i5-2500K would not be able to handle the game. The system in question used to be state of the art in 2012, and I didn’t really need a new PC for a long time. All my previous upgrades were triggered by game releases. In 2004, I upgraded to play Half-Life 2. In 2012, I upgraded again to play Starcraft II. So you can say, those games were system sellers for me.

In April 2020, I upgraded to a Ryzen 7 3700X and an RTX 2070 Super to finally play more recent games. But what else do you need for a good gaming experience? A high-quality gaming monitor, of course! For the past seven years, I had used a 1200p Dell office monitor for gaming. The picture quality for static images was good, the text readable. But for dynamic images, this monitor had quite an awful amount of motion blur.

I was never really an early adopter, and I only ever upgraded my hardware when necessary. In the meantime, gaming monitors had changed. 60 Hz is no longer state of the art, and the need for a 1200p resolution for more lines of text on the screen was no longer a requirement as 1440p and 4K are now mature technologies and I would just need to buy a larger monitor to get the same results.

I ended up buying the ASUS TUF Gaming VG27AQ on Amazon, as the COVID-19 pandemic didn’t allow me to buy electronics in local stores. I was especially interested in the ELMB-Sync (Extreme Low Motion Blur) feature, as I had hoped that backlight strobing might finally result in a CRT-like experience on TFT monitors. You know, no motion blur?

ELMB-Sync is a specific feature you find only on ASUS monitors (at the time of writing). It is a Variable Refresh Rate technology coupled with backlight strobing. Compared to G-Sync monitors, which can only do either G-Sync or ULMB (Ultra Low Motion Blur), this was advertised as being a step forward.

But it turned out that this monitor sucked at it. I had issues with strobe crosstalk at every refresh rate I could think of. I was quite unhappy with the device and returned it after only 3 days of testing and usage.

A photo of the strobe crosstalk test on the VG27AQ

As a “videophile”, it’s pretty difficult to buy display devices. Most technologies have advantages and disadvantages, and it seems to be impossible to find something which is the best in all aspects. Even so, making the decision to buy the VG27AQ based on reading various reviews seemed to be inadequate. A friend of mine told me not to rely on a single review site alone and suggested reading articles on other sites as well. So, the search was still on.

It was on May 16th, 2020 when I decided to buy the “least crappy monitor”, as a “best one” was not available, and I ended up with the ASUS ROG Swift PG279QE, which is still my monitor as of writing this article. Compared to the ELMB-Sync of the VG27AQ, the ULMB of this monitor was significantly better. The strobe crosstalk was weaker, but still there. The biggest problem was the restriction of refresh rates, as a lot of fantastic games are still limited to 60 Hz. Some examples are Factorio (released in 2020), Freedom Planet, Carrion (also released in 2020) and most emulators for older systems. ULMB could be hacked to allow 60 Hz backlight strobing, but it ended up hurting my eyes and I did not use it.

But in the end, this monitor wasn’t that bad. This was my first monitor with a variable refresh rate, and G-Sync worked fine with Doom Eternal. I played the game and I was happy.

Food for Thought

But after murdering thousands of demons, I felt something was wrong. Is this really how display technologies are supposed to work? Why can’t I scroll in Age of Empires 2 and read the text at the same time? How does everyone just accept this? Am I having mental issues which cause only me to see this motion blur?!

In 2019 and the beginning of 2020, I was rather busy developing an Amiga game which was to be released in March 2020. Tiny Little Slug was finally up for sale after many grueling debugging sessions to fix all the bugs that remained.

I still operate my Amiga on a Commodore 1084 monitor even after all these years. It’s one of those 14" CRTs you just have to love because of the excellent image quality, bright picture, strong contrast and lack of motion blur. While the development was done using an emulator, the final testing of the game was performed on a real machine. Through that, I was exposed to the smooth picture on the CRT. The Commodore 1084 outperformed all recent gaming monitors in terms of motion clarity. 30 years, and TFTs are still playing catch-up.

I tried to remember when and why we all swapped out our PC CRTs. I still remember playing CS 1.6 with friends in my parents’ basement. A LAN party with 6 adolescents. We all had CRTs, except for one friend who used an early-generation TFT. During that time, I had to ask my father every time I needed to move the monitor – it was simply too heavy for me. We still played at LAN parties later, when everyone had switched to TFTs. I also remember playing Half-Life 2 on an early TFT. But what happened? Why did everyone switch from CRT to TFT? Nearly all my memory on that topic is gone. The last thing I can remember is that those early 1024x768 TFTs were kind of slick, cool, or maybe “new”. It was the next “new thing”, and people liked the “new” stuff?

I decided to ask other people for their opinions. Co-workers, friends and family. “Those were quite heavy and the devices were too deep” was the aggregated verdict on the death of picture tubes.

I personally would never care about the weight or size of something if there is no alternative in terms of quality. I thought this would hold true for others as well. So what happened?

Were TFTs that much brighter? Were black levels in room-light conditions that bad? Was convergence a real issue even with high quality monitors? Was the sharpness of the picture not that good?

It seems especially the last question really depends on the resolution used.

I started to get deeper into this topic, and while I do own 2 Amiga monitors, which are display devices for PAL and NTSC 15 kHz video signals, I hadn’t used a PC CRT for quite some time. At the end of May this year I used eBay to gather some free CRT monitors for PC usage. The first one, an HP p930, was rather dim and thus not very helpful. Its flat tube also had linearity issues. The second one, a Siemens MCM 1706 (17”), was rather small but offered great brightness and a sharp picture, as long as the resolution was not higher than 800x600. The tube could also be driven with up to 1600x1200@70Hz, but the picture quality suffered badly. And so I experimented.

3 monitors in a row. 2 TFTs and 1 CRT.
A thing of sluggish beauty. Left to Right: Dell 2412M, Asus PG279QE and Siemens 1706 MCM

When Carrion came out this year, I was kind of fascinated that the internal rendering resolution was 640x480 while the frame rate was locked at 60 Hz. At the same time, it scrolls rather fast. This made the game look awful on my very expensive ASUS PG279 but like a blast at native resolution on the free 17” Siemens CRT. I was quite happy during that period that I had a CRT to play this game on.

Games of the post-2003 era

Fast Scrolling Games...

While looking for games which could suit a CRT I noticed that most of the current games don’t scroll very fast. Recent 2D platformers like Hollow Knight or Bloodstained do have a rather slow scrolling pace. Almost like… they were optimized for monitors that are known to have motion blur.

One could say games have changed. Developers have adapted from the quirks a CRT had in the past (horizontal scanlines) to the quirks the current technology has (motion blur).

The Future

Now, 17” is not big by current standards. And either the focus unit or the shadow mask of my Siemens 1706 is not suited for higher resolutions. But there were other monitors, like Sony’s GDM-FW900.

People seem to crave this monitor. They want it back. High prices are paid for them. But the issue is that these are old. You don’t really know what to expect from probably aged tubes.

At this point I was wondering: why is there no movement to bring CRT monitors back? Are the disadvantages of the past still part of the present? What would the picture look like today?

I dream about a 27” G-Sync compatible CRT monitor with high resolution. But what are the chances of reviving the machinery to make them? How high would the demand be? How expensive would this dream monitor be?

If I had more courage, I would kickstart a new CRT monitor. But who am I? Just an embedded software engineer without any manpower behind me. I could probably design the firmware, video processing algorithms, OSD, deflection control and all the other stuff that would belong to such a device. But the rest?

Glass, vacuum, chemistry with phosphors, etching of shadow masks and analog circuits. Combine that with the complex process of actually getting the phosphor dots onto the screen.

It’s too complex for one person.


Maybe this dream stays a dream. Maybe I'm currently in a weird, quirky phase. But the state of the art in displays makes me angry. Sample-and-hold displays need to leave the gaming market now and make way for something better suited to moving pictures.

TFTs still have their place: in offices, to read text with great font clarity. But for the rest?

OLED monitors are still not available, and who knows when they will finally arrive and whether they will solve the problems.

Micro LED is a thing of the future and there is no first monitor in sight.

What are your thoughts? How do you like your games?


Wednesday, November 27, 2019

Coroutine68k - Stackful coroutines for Bare Metal 68k Applications in C++

Recently I've read through the enhancements the C++ fans get with C++20.
I was especially interested in the concept of coroutines, as it is an approach to asynchronous programming and an alternative to state machines. The latter always have the problem that they are unstructured, and it is sometimes difficult to "see" the flow of the states. A synchronous algorithm as a structured program, on the other hand, might be easier to understand.

GCC is still not ready for C++20, so stackless coroutines are currently only possible using the mechanism used by Protothreads. Stackful coroutines, on the other hand, just need some context switching, which can be implemented as a library.

As I'm currently on a 68k trip, I wondered how easy it would be to create a similar thing for that processor. Last weekend I did a small case study and published it as a small open project on GitHub. Enjoy.

Friday, September 20, 2019

Connecting an Amiga Monitor to a modern Nvidia GTX 960

The methods described in this article can generate malicious video signals that are able to damage your CRT.
Please apply anything written here with care and thought. I can't be held responsible for damage to your hardware.
That said, if your monitor shows strange rolling screens or makes noises it has never made before, SHUT IT DOWN immediately.

Well, let's start....

How do you use an RGB monitor with a ~15 kHz horizontal frequency on a modern graphics card?
I don't know if this topic has already been addressed on various sites and forums on the internet. Some MAME enthusiasts seem to try to get this running. But with some random scripts from the internet I had no luck. So here is how I did it.

First of all? Why?
Simple! Native Amiga video modes on a PC for proper visuals with emulators!

Of course this means that an unusual horizontal frequency is needed. While standard VGA modes rely on ~31 kHz, a lower frequency is not reachable with standard methods, as such a modeline will most likely not be accepted by Xorg.

Also, of course, you need a way of connecting your PC to your Amiga monitor.
There are passive cables with VGA on one side and SCART on the other.
As my GTX 960 doesn't offer a VGA port, I used a DVI-I to VGA adapter without issues.

How to enable flexible clock rates

I assume that currently no xorg.conf file is used on your system.
This is quite common in the current Linux world, as the modes are now auto-configured.

Open nvidia-settings.
Click on "X Server Display Configuration".
Click on "Save to X Configuration File" and save it to a folder you are allowed to write to.

Edit the file and search for this line:

    HorizSync       30.0 - 83.0

Replace 30.0 with 13.0 and save the file to /etc/X11/xorg.conf

Restart Xorg and pray that it's still running.

You are now able to set horizontal frequencies below any VGA standard.

Add some modes

As I live in a PAL country, I'm used to PAL video modes on my Amiga system.
There are 3 common video modes which were used:

320 x 256 progressive, nearly square pixels, mostly used for games
640 x 256 progressive, pixels with half width, mostly used on workbench
640 x 512 interlaced, the perfect way to damage your eyes

At least for games, 320 x 256 is probably a nice resolution to work with.
640 x 256 is supported by Xorg, but I haven't found any program which is in harmony with that mode. It seems non-square pixels are not common any more.

Now onto some modelines you can add with xrandr:

These modes have a similar timing to the Amiga ones on a 1084 monitor.
You don't need to readjust the monitor to switch between your Linux system and your Amiga:

xrandr --newmode 320x256_50.08    7.09379   320 363 388 454   256 273 278 312  -hsync -vsync
xrandr --newmode 640x512_25.00i  14.18758   640 720 770 908   512 542 547 625  -hsync -vsync interlace

If you like to have some overscan this can be used:

xrandr --newmode 768x576@25i     14.75      768 789 858 944   576 581 586 625  -hsync -vsync interlace

For NTSC these can be used. I haven't tested them with an Amiga emulator though. Please keep in mind that these resolutions are not exactly the ones you would find on a US Amiga model. Use them as a starting point.

xrandr --newmode 320x210_60.12    7.15909   320 363 388 458   210 225 230 260  -HSync -VSync
xrandr --newmode 640x210_60.12   14.31818   640 726 776 916   210 225 230 260  -HSync -VSync
xrandr --newmode 640x435_30.07i  14.31818   640 720 760 907   435 456 460 525  -hsync -vsync interlace

Again some overscan:

xrandr --newmode 720x480@30i     13.5       720 736 799 858   480 486 492 525  -hsync -vsync interlace
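If you want to sanity-check such a modeline (or craft your own), the horizontal frequency is the pixel clock divided by the horizontal total, and the vertical frequency is the horizontal frequency divided by the vertical total. For the 320x256 PAL mode above:

```shell
# hfreq = pixel clock / htotal, vfreq = hfreq / vtotal
# (for interlaced modes the field rate is twice this vfreq)
awk 'BEGIN {
    pclk = 7.09379e6; htotal = 454; vtotal = 312
    hfreq = pclk / htotal
    printf "hfreq: %.3f kHz, vfreq: %.2f Hz\n", hfreq / 1000, hfreq / vtotal
}'
# prints: hfreq: 15.625 kHz, vfreq: 50.08 Hz
```

That matches the ~15.625 kHz / 50 Hz timing of a PAL Amiga, which is exactly what the 1084 expects.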

Now that the modes are created, we need to add them.
In my case I have to do this:

xrandr --addmode DVI-I-0 320x256_50.08

And to switch to that mode:

xrandr --output DVI-I-0 --mode 320x256_50.08

Your monitor should now show the upper-left portion of your desktop.
Use something like this to create an extended desktop and have a different view on your CRT:

xrandr --output DVI-I-0 --left-of HDMI-0 

I've tested this configuration with fs-uae in full-screen mode, and the result looks like it was running on a real Amiga, which is some really cool shit.

To avoid tearing in fs-uae, you need to add video_sync = 1 to your .fs-uae config file. Without it, a tear line travels very slowly down the screen. :-(

Monday, September 16, 2019

Custom git diff tool for Tiled

As I'm currently developing Tiny Little Slug and use Tiled for the levels, I frequently had the issue that I wanted to compare .tmx files across the history of the game. As .tmx files are rather difficult to parse with the human brain, I thought it must be possible to get an automated image export from Tiled. And I was lucky. Here is my approach:

Create a text file with this content. I named it tileddiff.


#!/bin/sh
# Render both map revisions to PNG with Tiled's tmxrasterizer,
# highlight the differences with ImageMagick's compare, and show the result.
tmxrasterizer "$1" /tmp/local.png || exit 1
tmxrasterizer "$2" /tmp/remote.png || exit 1
compare /tmp/local.png /tmp/remote.png /tmp/view.png
eog /tmp/view.png

Save it to a preferred place and make it executable.
Edit your ~/.gitconfig and add these lines at the end:

[difftool "tiled"]
    cmd = /home/derp/bin/tileddiff "$LOCAL" "$REMOTE"

Then use a call like this to, say, compare the current state to the previous one:

git difftool -t tiled HEAD~1

You should see a nice visualization where all changed tiles are highlighted in red.

Possible problems:
Please keep in mind that this only compares the .tmx files. If your changes involve external dependencies like .png or .tsx files, this approach will probably not catch them.
Differences between .tsx files are easy to observe though.
And comparing .png files is a different story. But the approach presented here will work for .png files as well ;-)