Relaying/Forwarding ports from one Windows server to another

Yesterday I migrated one of my services from one server to another. Since the protocol used by the service does not support an HTTP-style redirect, and the Windows Server version in use did not have the RRAS role available, I had to get a little creative.

Enter Komodia Relay, a great (and free, to boot!) tool for forwarding a TCP/UDP port to a different system. It basically works like a proxy: clients connecting to the old server are transparently proxied to the new one through Komodia Relay.
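
I can’t speak to Komodia Relay’s internals, but the core idea fits in a few lines. A minimal sketch in Node.js (TCP only), with made-up host and port values:

  // relay.js - accept connections on the old server, pipe them to the new one
  var net = require('net');

  var LISTEN_PORT = 12345;         // port clients still connect to
  var NEW_HOST    = 'new-server';  // hypothetical address of the new machine
  var NEW_PORT    = 12345;

  net.createServer(function(client) {
    var upstream = net.connect(NEW_PORT, NEW_HOST);
    client.pipe(upstream);         // client -> new server
    upstream.pipe(client);         // new server -> client
    client.on('error', function() { upstream.destroy(); });
    upstream.on('error', function() { client.destroy(); });
  }).listen(LISTEN_PORT);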

Usage is remarkably easy, and even under loads of several hundred connections the application still performs beautifully.

If you are more the GUI-oriented type and do not mind paying for your ride, Network Activ’s AUTAPF might be worth checking out.

Howto: Titanfall with Steam Overlay

I am not a big fan of EA’s Origin. The software lacks a simple list-view and seems to love telling me how little it thinks of me. One prominent example of this behaviour: I cannot use the Steam overlay alongside the Origin overlay. It’s either Origin or nothing. Until now.

With Titanfall being released, I have one more game in my Origin library I am probably going to play quite a bit. While looking around, I found the usual subpar solution of adding Origin.exe as a non-Steam game. Unacceptable.

Thankfully I came across NoFate’s wonderful homepage and his PAR remover. Simply navigate to your Titanfall game directory, make a backup copy of the original Titanfall.par and upload the original PAR file to NoFate’s PAR remover site. You will receive a new PAR file that allows you to start Titanfall directly – and use the Steam overlay.

But what about your friends on Origin? They will still see you playing the game, they will still be able to join you – but you cannot use the Origin overlay anymore. Well, that’s a shame but does not bother me as much because most of my friends are on Steam.

WebDrive: Increasing the “Total Space” value for a drive

South River’s WebDrive is one of the most important tools for me. It connects to several servers and mounts them as drives on my Windows machine.

If you work with a WebDAV or FTP connection and do not have quotas enabled, WebDrive will, by default, assume a total capacity of 100GB for the drive/connection. Especially if you are moving tons of files, 100GB is nothing and the limit gets in your way.

Thankfully you can set the limit per connection via the Windows’ registry:

  • Start regedit
  • Navigate to HKEY_CURRENT_USER\Software\South River Technologies\WebDrive\Connections\YOUR_CONNECTION_NAME
  • Set the QuotaMB value from the default 102400 (100GB) to something larger, e.g. 1024000 (1000GB)
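
Since this is just a registry value, the change can also be scripted as an importable .reg file. A sketch, assuming QuotaMB is a plain DWORD – double-check the value type in regedit before importing:

  Windows Registry Editor Version 5.00

  ; 0x000fa000 = 1024000 MB, i.e. roughly 1000GB
  [HKEY_CURRENT_USER\Software\South River Technologies\WebDrive\Connections\YOUR_CONNECTION_NAME]
  "QuotaMB"=dword:000fa000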

After disconnecting/reconnecting, the new limit should show up. Cool stuff.

Synology DS2413+ Review

A colleague once told me that building your own storage-server is way too much work. “Just order one,” he used to say, “it’s not worth the time and the trouble. Just unbox, pop in the disks, install and you are good to go”. That was seven years ago and I remember arguing about SOHO use-cases where a small NAS would have been too little and a rack-mounted storage would have been too much. “Just get two smaller units,” he laughed at me.

As it turns out, he was right. While I was busy replacing obscure hardware, sifting through HCLs and tinkering with different OpenSolaris upgrade paths (side note to myself: never again upgrade to Solaris Express, go with OpenIndiana!), he called the manufacturer’s tech-support and was good to go.

Almost a decade later I am older and (arguably) a little wiser. To replace my patchwork Solaris file-server I decided to go with something pre-made: the Synology DiskStation DS2413+.

On paper it does everything I need:

  • Comes with 12 hot-swappable 3.5″ SATA disk bays
  • Small, non-rackmounted form factor suitable for storage in offices
  • Supports growing the total volume by replacing a number of disks (combination of lvm/md)
  • Supports encryption (Note: Only via SMB, no support for encryption via NFS!)
  • 2x 1Gbit Ethernet ports (LACP supported)
  • Support for Infiniband-based expansion with a second unit, giving me a [theoretical] total of 24 bays
  • Intel x86 architecture system with 2GB of memory (can be upgraded to 4GB)

The base unit without any disks set me back 1200 EUR. Instead of continuing the tragic history of buying the largest consumer hard-disk I could find, I opted for longevity and chose 12x Seagate Constellation CS 2TB drives, giving me roughly 18TB of usable storage in an SHR-2 (RAID6-style) configuration – two of the twelve disks go towards redundancy, so 20TB raw, which shows up as about 18TB formatted. The disks set me back another 1200 EUR, an investment well worth it (I hope?).

So the first conclusion we can draw here: If you want to fully use the DS2413+, it’s not a very cheap device.

The build-quality of the device is pretty nice with no cheapo plastic parts on the exterior. The disk trays are well made, have no splinters, rough edges or deformations so disks slide right in and sit on a nicely padded base.

Synology ships the DS2413+ with a handful of accessories; the only noteworthy item is the included ethernet cable: a 2m CAT5e cable – I haven’t seen one of those in years.

The disk bay can be locked with one of the two included keys. There is no way to lock individual disks, only the entire bay.

After starting the DS2413+ for the first time, it needs to install its operating system, Synology’s Linux-based “DSM”. Installation is simple: browse to the DS2413+’s IP-address and follow the web-based wizard, which downloads the newest DSM automatically. About 10 minutes later the device was online.

You can configure the entire device through a nice-looking web-interface. DSM takes some strong cues from OSX in terms of its UI design; if you have ever used a Macintosh with OSX, you should have no problems finding the options you want.

Synology gives you the option to install additional packages that extend the functionality of your NAS. Unfortunately, all packages get installed onto your storage pool, so when you swap out all the disks, the packages are gone. This is a major problem for me: the DS2413+ does not have a dedicated system drive.

The packages range from useless stuff like cloud-backup or media-streaming to Python, Perl or Java. You can install a LAMP stack on your NAS if you wish to do so. Honestly, this looks more like a gimmick than a really useful feature, especially considering that the Linux flavour on the DS is a bare busybox with only a few additional binaries.

The volume management is where things get interesting. Since this is a Linux system, there is no ZFS. Surprisingly, the only file-system supported by DSM is ext4. There are some HFS tools installed as well but they are useless for my use-case and I did not spot any option to create HFS+ volumes.

The DS2413+ supports all common RAID levels and sports its own lvm/md-based “SHR” RAID level, which allows volumes to be grown dynamically.

I hope that the introduction of DSM5 in January 2014 will bring the option to migrate to btrfs. I enjoyed being able to snapshot file-system states on Solaris, and it came in handy several times.

Network performance is okay. LACP works, although the setup is a little weird: it throws away the first port’s configuration instead of using it for the aggregated adapter. That may just be a Linux thing.

SMB2 performance seems to suffer quite a bit when the device is busy; FTP and WebDAV keep working fine in those cases. NFS works – except on encrypted folders. There are no SMB-to-NFS permission problems.

When changing SMB or NFS options, DSM restarts all sharing services, meaning that if you change an SMB option while a client is connected via NFS, that client will be disconnected as well. Meh.

So, am I happy with this device, or would I recommend rolling your own build? Simply put: I am happy. There is much to see and tinker with – I have not even mentioned the energy-saving options or the noise level of the device. Both are great.

There are a few nitpicks, but the overall build-quality and software are fantastic, making the device easily usable for all target groups. The option to extend the DS2413+ with a second unit via Infiniband is a great idea, and hopefully the expansion unit will still be for sale in a few years.

Whether you are a passionate home-user with a hunger for storage or a small business unwilling to get a rack, the DS2413+ is worth your attention. Otherwise, there are plenty of great rack-mounted options at the same price that do the same.

Change the locale in Battlefield 4

I won’t humor my esteemed readers with my personal opinion on Battlefield 4, but there is one thing I must get off my chest: localization in games usually sucks. And it sucks even more when developers, publishers and digital distribution channels alike will not give you the option to change the language of a game.

Thankfully, Battlefield 4 can be reset to English in numerous ways. You have probably read about deleting the unnecessary extra locales (everything under Data\Win32\Loc that does not start with en*), but unfortunately these files will be restored with the next patch.

A much better and less intrusive way is available by altering your registry:

  1. Start regedit
  2. Navigate to HKEY_LOCAL_MACHINE\SOFTWARE\EA Games\Battlefield 4
  3. Set the “Locale” value to “en_EN”
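
The same change as an importable .reg sketch – assuming “Locale” is a plain string value; on 64-bit Windows the key may live under SOFTWARE\Wow6432Node\EA Games\Battlefield 4 instead:

  Windows Registry Editor Version 5.00

  [HKEY_LOCAL_MACHINE\SOFTWARE\EA Games\Battlefield 4]
  "Locale"="en_EN"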

This should work on every localized version and teach the game some manners.

SyncBackSE: Schedule a Move Operation on Windows

I have several file-system operations I cannot perform during the day – the machine’s performance would suffer and I would get angry e-mails. So I have to schedule simple move operations.

Now I could do this with Windows’ own task scheduler, but I would have to write either a VBScript or a batch file to specify the details, and performing a dry run would also suck. Apparently there is no dedicated software that adds a “Schedule Move” or “Schedule Copy” context-menu operation for quick use (hint: I’ll develop one once I have beaten Grand Theft Auto V), so I started experimenting.
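
For reference, the batch-file route would look roughly like this – paths, names and times below are placeholders, not my actual setup:

  :: move_job.cmd - one-shot move of a directory tree
  :: robocopy /E /MOVE copies subdirectories and deletes the source afterwards
  robocopy "D:\staging\videos" "E:\archive\videos" /E /MOVE /LOG:"D:\logs\move_job.log"

Then register it once for a quiet hour via the Windows task scheduler:

  schtasks /Create /TN "NightlyMove" /TR "D:\scripts\move_job.cmd" /SC ONCE /ST 03:00

Workable (robocopy’s /L switch even provides a crude dry run), but it is exactly the kind of manual scripting I wanted to avoid.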

It seems the amazing SyncBackSE fits the bill. I already own a license for this great piece of wizardry, which I use to sync between multiple machines and back up my files. It turns out you can configure a new, one-time job to be your scheduled file mover:

  1. Create a new backup profile and choose the directory above the one you want to move.
  2. Choose “Select Subdirectories and Files” to specify the directory/directories you want to move.
  3. Now select your target directory.
  4. Add a schedule.
  5. As a condition, set “Move file to target”.

SyncBackSE will automatically move your file, produce a nice log for you to review and even allows for a dry run.

Component is no option for me.

I’m a quality whore. Give me quality. What, it costs 5 additional bucks? I think you did not hear me. I said. Give. Me. Quality.

When I got my Hauppauge PVR2 GE Plus, it only came with component multi-av cables for the Playstation 3. “Big deal”, people say, “you don’t see the difference on YouTube anyway.”

That may be true if you play Battlefield or Call of Duty all day. However I almost wanted to cry when I saw my glorious Playstation outputting mushy pictures…


(It’s Jena-san from Planetarian, I’m sure!)

As this small video should demonstrate, there are quite a few differences, from the foggy picture and the blurry outlines to the ashen white colors and the strange color artifacts. Simply put: Playstation 2 era quality.

My advice: grab a simple HDMI-to-DVI adapter cable and one of those DVI+Toslink-to-HDMI converters, and output beautiful, sharp material. Even if you think the quality difference won’t be visible after your post-processing, it is at least still visible on your own television set during recording.

My streaming setup

As you have probably noticed if you have followed my projects for an extended amount of time, I love streaming. The idea of personal media has become an incredible creative influx in today’s web culture – think of the great podcasts, let’s plays and weekly shows you enjoy.

Since I’ve changed my workflow and software stack around a bit in the past few months, here is a small look at how I work. I am not suggesting this setup is generally awesome (it clearly is not), but at least it’s a solid, mostly software-based foundation.

I’m not the average player who simply streams his progress – I am too lazy to produce a continuous series of videos. Another problem is that gameplay is – in my opinion – only interesting at 720p or higher resolutions. Unfortunately, due to the poor cut-throat politics of the German Telekom, it is impossible for me to get proper broadband internet access. I have to put up with ~100 kilobytes/s upload and a mere 10 Megabit/s downstream. Since 100 kilobytes/s works out to only about 800 kbps, the best I can manage is 480p with about 700-750 kbps of video data and 96/128 kbps AAC audio.

However, I do want to be able to record in high definition anyway. Ideally I record in 720p@30 and stream in 480p@30 – in realtime, that is. Technically this should not be an issue: my computer supports Intel Quick Sync, so I could (in theory) encode my local high-definition copy of a video without suffering any performance penalty. I specifically say “in theory” because reality leaves me in despair.

In the past I have used Dxtory and Xsplit to stream. Dxtory can output data both to a file and to a pseudo-camera, and the camera output can then be used in Xsplit to stream in 480p. Unfortunately, Dxtory does not pass any resolution details to its camera output, so the content is always 4:3 and blurry as hell in Xsplit. That may be an acceptable short-term solution for 480p crap quality, but it is no keeper. Another bummer is that Dxtory does not make use of Quick Sync. The same goes for Xsplit (except when doing local recording – which renders the entire feature moot).

I also want to mix several input sources (multiple webcams, microphones, my Hauppauge PVR2, plus local media files [avi, mkv] and so on), so my choices are rather limited. Again, Xsplit is my weapon of choice here. I have tried Open Broadcaster Software, and while the software performed well, the user experience and some kinks with capturing DirectX and OpenGL surfaces once again left me in despair. Capturing an exclusive madVR surface is impossible with OBS in its current state; there is flickering all over the place.

So yeah, Xsplit it is for preparing and switching stages. Starting with version 1.3 it has also become a useful tool for local recording thanks to its Quick Sync support. Again, I could use OBS here, or even Mirillis Action!, but I already own an Xsplit license and there is too little difference in the output to warrant an extra software setup.

As mentioned before, I use a Hauppauge PVR2 Gaming Edition Plus to capture HDMI and component material from my Xbox 360, Playstation 3, Nintendo Wii and Playstation Portable. It works fine and the quality is acceptable, even if the blurry Playstation 3 component output makes me cry. One little thing has to be noted: the PVR2 has a streaming lag of about 3750ms.

Commentary arriving 3-4 seconds early would be rather ugly, so I keep my microphone input in a 3750ms Virtual Audio Cable repeater buffer, which also lets me monitor my line-in locally in realtime while adding latency to the audio data for use in Xsplit. It’s a great piece of software; I’ve fiddled around with VB-Cable before, but VAC is just a much better experience for me. Your mileage may vary, especially since my requirement here is inducing latency while most people want to reduce it.

So, what’s left to do? Well, I still need a proper microphone that does not sound like I’m trapped in Buffalo Bill’s basement. I also need an additional, dedicated SSD for dumping the video data. And yeah… proper upstream – the one thing I will never get.

In short:

- Dxtory for capturing “strange” sources
- Hauppauge PVR2 for capturing consoles
- Virtual Audio Cable for mixing, splitting and postprocessing live incoming audio data
- Xsplit for bringing all sources and media together

I am not saying that the software listed above is perfect or the best there is. God knows Xsplit is far from perfect, and OBS shows SplitMedia Labs who is boss in some departments (and no – “having more features” is not a good excuse for having the world’s slowest UI or for not implementing features supported by libx264 [like OpenCL]). But my workflow could be a lot more miserable, so I guess this can pass as a recommendation.

Bring order to your chaos with File Juggler

I’m a sucker for sweet file-management tools. My ever-growing, ever-changing list of essentials has a new addition, and I welcome the fabulous File Juggler: a simple yet powerful tool that lets you define rules based on file names, modification dates and other criteria, and perform operations on the matching files.

The reason I decided to shell out the $25 for the tool is that it just works. No bells and whistles, no stupid, overloaded crap UI. Select a few sources to monitor, define your rules, done. File Juggler will automatically keep watch over the files and move, delete, rename or extract them when the rules apply.

Unfortunately, in the current version 1.3 you cannot move entire folders around. So if I wanted to move .\a\b to .\c\b, the files from .\a\b would end up directly in .\c\. Fortunately, the developer behind the application is already working on folder operations for version 1.4, so I have high expectations :) .

Do not overuse rich notifications

One of the more prominent changes in Google Chrome 28 is the addition of rich notifications. Unlike the standard Webkit notifications, these enhanced blurbs can contain more content and have a real sense of interactivity to them, because they can house buttons, images and lists – without the need to load markup from files.

It’s pretty clear that rich notifications will greatly enhance many extensions. On the other hand, it’s also clear that inept developers will use them just for the sake of using them – to harmful effect.

Imagine some sort of instant messenger extension. Every time you receive a message you will get a rich notification, allowing you to see the name, avatar, message and message history of the contact and allowing you to quickly hit “Reply” to send a response. Cool stuff and infinitely useful.

(Screenshot: Bitcasa Everywhere login notification – beautiful, but is it useful?)

Now imagine you get a notification each time a contact comes online or goes offline, allowing you to easily start a conversation with just the click of a button.

Why differentiate between the two notifications? Rich notifications, contrary to the standard Webkit notifications, are persistent until they are dealt with. That means your notification counter will keep increasing until you either hit the “Reply” button or clear the messages from the notification tray (or the extension destroys the notification itself). It also means you do not need to show two distinct notifications for John Doe coming online and John Doe going offline – you can simply update John Doe’s status notification.
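
A sketch of that update-instead-of-create pattern with Chrome’s chrome.notifications API – the contact object and the ID scheme here are made up for illustration:

  // Reuse one notification per contact: update it if it already exists,
  // otherwise create it under the same ID on the first event.
  function showPresence(contact, online) {
    var id = 'presence-' + contact.id;  // hypothetical per-contact ID
    var options = {
      type: 'basic',
      iconUrl: contact.avatar,
      title: contact.name,
      message: online ? 'is now online' : 'went offline'
    };
    chrome.notifications.update(id, options, function(wasUpdated) {
      if (!wasUpdated) {
        chrome.notifications.create(id, options, function() {});
      }
    });
  }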

That may work for one extension that spams the notification tray. It certainly will not work for five. You will simply lose track of important notifications (despite the ability to prioritize them) among the sheer number of junk notifications.

In my opinion these junk notifications should continue to use the Webkit notifications – if they need to be shown at all.

Windows web stack woes

For quite a while I was not satisfied with the performance of one of my Windows Server 2008 machines. While the machine had reasonable processing power, a fair amount of RAM and almost no disk IO, the rendering performance of PHP pages on IIS 7 was simply atrocious.

Different PHP versions, lots of TCP and WinCache tweaking – no cigar. What could possibly cause the server to take about 8 seconds to render a simple WordPress front page?

The answer puzzled me: localhost.

Due to the machine’s IPv6 address, name resolution preferred the IPv6 address over the IPv4 one, causing significant slowdowns on each and every connection to MySQL.

After simply replacing “localhost” with “127.0.0.1” in all configuration parameters I got the kind of snappy performance I expected. Crazy stuff.
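
In WordPress’s case that is a one-line change in wp-config.php:

  // wp-config.php - talk to MySQL via the IPv4 loopback address instead
  // of letting "localhost" resolve to the IPv6 address first
  define('DB_HOST', '127.0.0.1');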

Transitioned

After a period of transition I finally decided to go fully with blog.tsukasa.eu and redirect requests from tsukasa.jidder.de here.

That way links won’t break and I can finally utilize all the modern shenanigans I’ve installed.

Future ahoy!

Edit 2013-06-15: The missing comments from the transition period are also on board now, me hearties!

The 80s called, they want their Hulk-Hogan muscle shirt back.

Auto-login from Bitcasa Everywhere

One feature I’ve been working on for a while is automatic login to my.bitcasa.com. To operate correctly, Bitcasa Everywhere needs to be logged in, but actually logging in on each start of the browser is the user’s responsibility. That makes sense from a security point of view but is inconvenient for me.

So how can we automate this?

Since I already have a rather nice foundation for modifications to Bitcasa Everywhere, I decided to implement this as an additional feature in BCE.

There’s one minor annoyance, though: The login form contains two hidden values that we need to keep track of.

Due to Chrome’s security constraints it is not possible to use an iframe and tinker with the values from our bg.js, so that idea is out.

That leaves us with jQuery’s .load() – and indeed, this works fine. Of course we do not want to actually show the login page, so the data goes into a detached pseudo-div:

  // Detached container div; it is never inserted into the DOM,
  // so the login page is loaded but never rendered.
  var loginPage = $('<div>');
  var form_csrf_token = "";
  var form_code = "";

  // Fetch the mobile login page and pull the two hidden form
  // values out of it once the request completes.
  loginPage.load('https://my.bitcasa.com/login?interface=mobile',
                 function(response, status, xhr) {
                   form_csrf_token = loginPage.find('input[name=csrf_token]').val();
                   form_code       = loginPage.find('input[name=code]').val();
                 }
  );

For some reason the success callback never seems to get called, so I had to work around this by refactoring the extraction you see above into a new function I can call from bg.js. The real login happens on each badge update:

  // Process automated login: only when the user enabled the feature,
  // we hold freshly extracted login tokens and we are not logged in yet.
  if(JSON.parse(localStorage["auto_login"]) && renewedLoginToken && notLoggedIn == true)
  {
      form_csrf_token = loginPage.find('input[name=csrf_token]').val();
      form_code       = loginPage.find('input[name=code]').val();

      // Only fire the login POST if the hidden values are actually present.
      if(form_csrf_token != undefined && form_csrf_token != "0" && notLoggedIn == true)
      {
          $.post("https://my.bitcasa.com/login?client_id=None&interface=mobile&redirect=http://my.bitcasa.com/",
                 { password: localStorage["bitcasa_password"], user: localStorage["bitcasa_username"], csrf_token: form_csrf_token, code: form_code },
                 function() {
                    autoLoggedIn      = true;
                    renewedLoginToken = false;  // token is consumed, re-extract next time
                 });
      }
  }
  else
  {
      // Not ready yet: (re)load the login page to extract fresh hidden values.
      extractLoginParameters();
  }

Bingo, fully working auto-login for Bitcasa Everywhere. I just love Chrome and jQuery. :)

You can grab the entire source-code here. Open the Bitcasa Everywhere options from the Extensions page and configure the feature as you see fit!

Bitcasa Everywhere Chrome modification for infinite queue

The Bitcasa Everywhere extension is nice but unfortunately limits you to a maximum of three parallel downloads with no queued items.

To work around this problem, I modded the extension a while ago to allow for infinitely many queued items. The basic idea is to check whether Bitcasa still reports 3 active downloads; if it does not, the extension automatically starts the next queued download. That does mean that if you get the infamous “an error occured” HTTP status 500, this modification is dead in the water and will not work.
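
The core of it boils down to something like the following sketch – the function and variable names here are made up, the real logic lives in the background page:

  // On every badge update, top the active downloads back up to
  // Bitcasa's limit of three parallel transfers.
  var MAX_ACTIVE = 3;
  var queue = [];                      // URLs waiting for a free slot

  function processQueue(activeDownloads) {
    while (activeDownloads < MAX_ACTIVE && queue.length > 0) {
      startDownload(queue.shift());    // hypothetical helper hitting the BCE API
      activeDownloads++;
    }
  }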

Apart from the queue, the mod puts download/queue indicators into the toolbar badge, moves processing from the popup to the background and comes with a handy way for reordering multiple queue items (CTRL+click on multiple items, then drag them around to reorder).

The mod can also automatically log you into Bitcasa, so you don’t have to do this manually every time you start your browser, and it includes a little RSS feed you can use to stay up to date with Bitcasa’s client releases.

There’s even experimental support for put.io to pull your files from put.io to Bitcasa now – all automated, of course.

To add items to the queue, paste the links into the popup’s URL input; if you want to start a download directly (and have fewer than 3 active downloads), just right-click on an item and select “Download to Bitcasa”.

Since this is a client-side mod, you will need to keep Chrome running to process the queue.

Download:

Instructions:
Extract this package into the Google\Chrome\User Data\Default\Extensions\jbebdjcjllheeclffnofhgcimmlkkbon folder in your profile, overwriting the original files. Start Chrome, right-click the Bitcasa toolbar icon and select “Options”. Configure the options as you see fit and save – even if you do not change a thing. This step is necessary to register the default values and ensures you do not end up with options you did not want.

Notes:
If you enable the “Update Feed” feature, the extension will pull this RSS feed whenever you open the popup. The webserver does not log your access, but you should be aware of the request nonetheless. When “Update Feed” is disabled, no external request is made. The source-code of the feed generator is also available (see this link for the live version of the main program).

You will probably notice some smaller glitches – please do not report them, I know about them. That does not mean I don’t care – it means I don’t need redundant information.

The archive contains the entire source-code. Please feel free to review it to ensure the modification contains no malicious code to steal your credentials.

Running Bitcasa Everywhere in the background:

To perfectly match Easy put.io for full automation you can configure BCE to run as a background application. To do so simply reload the unpacked extension:

  1. Go to Chrome’s extension management page.
  2. Click “Load unpacked extension”.
  3. Select the unpacked folder from the RAR archive (Note: Please unpack it to another location first, e.g. your desktop!).
  4. Chrome will inform you with a notification that the extension has been installed as a background application.
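
My assumption about the mechanics here: Chrome only keeps an extension alive after the last window closes if its manifest requests the “background” permission, roughly like this excerpt (name and version are placeholders):

  {
    "name": "Bitcasa Everywhere (mod)",
    "version": "1.0",
    "manifest_version": 2,
    "permissions": ["background"]
  }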

If you no longer want BCE to run in the background, you must uninstall the entire extension, install the original Bitcasa Everywhere and replace the files as described above – without loading the unpacked extension.

put.io Notes:

As of 20130809 the modification contains experimental, quite early support for pulling files from put.io directly into Bitcasa. To use the feature, simply tick the checkbox and fill in the OAuth token. Since this modification is not listed as a put.io application, you need to use an extension like Easy Put.io to generate a token.

For Easy Put.io, simply authorize the application, open the Easy Put.io popup, right-click and select “Inspect Element”, go to the “Resources” tab, select the Chrome extension’s local storage and copy the value of putio_token into the BCE Mod options.

Please note that the Bitcasa Everywhere API is unable to create folders; all your put.io files will be sent to the Downloads folder within your Infinite Drive.

Files from put.io will have a [put.io] prefix in their file-name within the queue. The prefix is just cosmetic; the actual file-name will not be altered when transferring.

Changelog:

  • (20130913) Added put.io ignorelist to prevent files with matching names from being queued.
  • (20130913) Improved the processing of put.io files (caches the last 1000 file IDs instead of the highest ID).
  • (20130809) Fixed the fubar’d update feed.
  • (20130809) Added experimental put.io support for automatically pulling data into Bitcasa.
  • (20130809) Altered context menu to be configurable, see new option.
  • (20130720) Added right-click “Add to Queue” menu as per request.
  • (20130622) Added experimental rich notification on startup when autologin is disabled and user is not logged-in. Requires Chrome 28.
  • (20130622) Added dynamic login/logout button in popup and unified some CSS classes. To use, fill out the username/password in the options but leave the toggle for automated logon unchecked.
  • (20130622) Added optional “Update Feed” for easier client-update checking.
  • (20130622) Revamped options page, should be easier to understand now.
  • (autologin) Added auto-login feature.
  • (mod) Added multisortable queue.
  • (mod) Revamped popup UI and moved processing to background.