SyncBackSE: Schedule a Move Operation on Windows

I have several file-system operations I cannot perform during the day: the machine’s performance would suffer and I would get angry e-mails. So I have to schedule simple move operations.

Now I could do this with Windows’ own Task Scheduler, but I would have to write either a VBScript or a batch file to specify the details. Performing a dry run also sucks. Apparently there’s no dedicated software that adds a “Schedule Move” or “Schedule Copy” context-menu operation (hint: I’ll develop one once I have beaten Grand Theft Auto V) for quick use, so I started experimenting.
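For comparison, the do-it-yourself route would look roughly like this – a minimal sketch, assuming robocopy (which ships with every modern Windows) does the actual moving; the paths, file names and the task name are made up for illustration:

[code]
:: move-job.cmd – moves a directory during the night and keeps a log for review
robocopy "D:\staging\recordings" "E:\archive\recordings" /MOVE /E /LOG:"C:\logs\move-job.log"

:: Register the script with the Task Scheduler (run once, 3 AM), from an elevated prompt:
schtasks /create /tn "NightlyMove" /tr "C:\scripts\move-job.cmd" /sc once /st 03:00
[/code]

Appending /L to the robocopy call even gives you a free dry run: it only lists what it would move.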

It seems the amazing SyncBackSE fits the bill. I already own a license for this great piece of wizardry to perform sync operations between multiple machines and back up my files. It turns out you can configure a new, one-time job to be your scheduled file mover:

  1. Create a new backup profile and choose the directory above the one you want to move.
  2. Choose “Select Subdirectories and Files” to specify the directory/directories you want to move.
  3. Now select your target directory.
  4. Add a schedule.
  5. As a condition, set “Move file to target”.

SyncBackSE will automatically move your files and produce a nice log for you to review, and it even allows for a dry run.

Component is not an option for me

I’m a quality whore. Give me quality. What, it costs 5 additional bucks? I think you did not hear me. I said: Give. Me. Quality.

When I got my Hauppauge PVR2 GE Plus, it only came with component multi-AV cables for the Playstation 3. “Big deal”, people say, “you don’t see the difference on YouTube anyway.”

That may be true if you play Battlefield or Call of Duty all day. However, I almost wanted to cry when I saw my glorious Playstation outputting mushy pictures…


(It’s Jena-san from Planetarian, I’m sure!)

As this small video should demonstrate, there are quite a few differences, from the foggy picture to the blurry outlines to the ashen white colors to the strange color artifacts. Simply put: Playstation 2 era quality.

My advice: grab a simple HDMI-to-DVI adapter cable plus one of those DVI+Toslink-to-HDMI converters and output beautiful, sharp material. Even if you think the quality difference won’t be visible after your post-processing, it is at least still visible on your own television set during recording.

My streaming setup

As you have probably noticed if you have followed my projects for an extended amount of time, I love streaming. The idea of personal media has become an incredible creative force in today’s web culture. Think of the great podcasts, Let’s Plays and weekly shows you enjoy.

Since I’ve changed my workflows and my software stack around a bit in the past few months, here is a small look at how I work. I am not suggesting this setup is generally awesome (because it clearly is not), but at least it’s a solid, mostly software-based foundation.

I’m not the average player who simply streams his progress. I am too lazy to produce a continuous series of videos. Another problem is that gameplay is – in my opinion – only interesting at 720p and higher resolutions. Unfortunately, due to the poor cut-throat politics of the German Telekom, it is impossible for me to get proper broadband internet access. I have to put up with ~100 kilobytes/s (roughly 800 kbps) of upstream and a mere 10 Mbit/s of downstream. The best I can manage with that is 480p with about 700-750 kbps of video data and 96/128 kbps AAC audio.

However, I do want to be able to record in high definition anyway. Ideally I record in 720p@30 and stream in 480p@30 – in realtime, that is. Technically this should not be an issue: my computer supports Intel Quick Sync, so I could (in theory) encode my local high-definition copy of a video without suffering any performance penalty. I specifically mention “in theory” because reality leaves me in despair.

In the past I have used Dxtory and Xsplit to stream. Dxtory can output data to both a file and a pseudo-camera, and the camera output can then be used in Xsplit to stream in 480p. Unfortunately, Dxtory does not pass any resolution details to its camera output, so the content is always 4:3 and blurry as hell in Xsplit. That may be an acceptable short-term solution for 480p crap quality, but no keeper. Another bummer is that Dxtory does not make use of Quick Sync. The same goes for Xsplit (except when doing local recording – which renders the entire feature moot).

I also want to mix several input sources (multiple webcams, microphones, my Hauppauge PVR2, plus local media files [avi, mkv] etc.), so my choices are rather limited. Again, Xsplit is my weapon of choice here. I have tried Open Broadcaster Software, and while it did perform well, the user experience and some kinks with capturing DirectX and OpenGL surfaces once again left me in despair. Capturing an exclusive madVR surface is impossible with OBS in its current state; there is flickering all over the place.

So yeah, Xsplit it is for preparing and switching stages. Starting with version 1.3 it has also become a useful tool for local recording thanks to Quick Sync support. Again, I could use OBS here, or even Mirillis Action!, but I already own an Xsplit license and there is too little difference in the output to warrant an extra software setup.

As mentioned before, I use a Hauppauge PVR2 Gaming Edition Plus device to capture HDMI and component material from my Xbox 360, Playstation 3, Nintendo Wii and Playstation Portable. It works fine and the quality is acceptable, even if the blurry Playstation 3 component output makes me cry. One little thing has to be noted: the PVR2 has a streaming lag of about 3750ms.

It would be rather ugly to have the commentary arrive 3-4 seconds early, so I manually keep my microphone input in a 3750ms Virtual Audio Cable repeater buffer. It also allows me local playback in realtime from my line-in while adding latency to the audio data for use in Xsplit. It’s a great piece of software; I’ve fiddled around with VB-Cable before, but VAC is just a much better experience for me. Your mileage may vary, especially since my requirement here is inducing latency while most people want to reduce it.

So, what’s left to do? Well, I still need to get a proper microphone that does not sound like I’m trapped in Buffalo Bill’s basement. I also need an additional, dedicated SSD for dumping the video data. And yeah… proper upstream – the one thing I will never get.

In short:

– Dxtory for capturing “strange” sources
– Hauppauge PVR2 for capturing consoles
– Virtual Audio Cable for mixing, splitting and postprocessing live incoming audio data
– Xsplit for bringing all sources and media together

I am not saying that the software listed above is perfect or the best there is. God knows Xsplit is far from perfect, and OBS shows SplitMedia Labs who is boss in some departments (and no – “having more features” is not a good excuse for having the world’s slowest UI or for not implementing features supported by libx264 [like OpenCL]). But my workflow could be a lot more miserable, so I guess this could pass as a recommendation.

Bring order to your chaos with File Juggler

I’m a sucker for sweet file-management tools. My ever-growing, ever-changing list of essentials has a new addition: I welcome the fabulous File Juggler. File Juggler is a rather simple yet powerful tool that lets you define rules based on file names, modification dates and other criteria and perform operations on matching files.

The reason I decided to shell out the 25$ for the tool is that it just works. No bells and whistles, no stupid, overloaded crap UI. Select a few sources to monitor, define your rules, done. File Juggler will automatically keep watch over the files and move, delete, rename or extract them when the rules apply.

In the current version 1.3 you unfortunately cannot move entire folders around. So if I wanted to move .\a\b to .\c\b, the files from .\a\b would end up in .\c\. Fortunately, the developer behind the application is already working on folder operations for version 1.4, so I have high expectations 🙂 .

Windows web stack woes

For quite a while I was not satisfied with the performance of one of my Windows 2008 servers. While the machine had reasonable processing power, a fair amount of RAM and almost no disk IO, the rendering performance of PHP pages on IIS 7 was simply atrocious.

Different PHP versions, lots of TCP and Wincache tweaking – no cigar. What could possibly cause the server to take about 8 seconds to render a simple WordPress front page?

The answer puzzled me: localhost.

Due to the IPv6 address of the machine, the name resolution preferred the IPv6 address (::1) over the IPv4 one, causing significant slowdowns on each and every request to MySQL.

After simply replacing “localhost” with “127.0.0.1” in all configuration parameters, I got the kind of snappy performance I expected. Crazy stuff.
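In a WordPress setup this boils down to a single line in wp-config.php – a minimal sketch, with placeholder database name and credentials:

[code language="php"]
// wp-config.php – database settings (name/user/password are placeholders)
define('DB_NAME', 'wordpress');
define('DB_USER', 'wp_user');
define('DB_PASSWORD', 'secret');

// "localhost" made the resolver try ::1 first on my machine;
// forcing IPv4 skips the failed IPv6 attempt on every request.
define('DB_HOST', '127.0.0.1');
[/code]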

Transitioned

After a period of transition I finally decided to fully go with blog.tsukasa.eu and redirect requests from tsukasa.jidder.de to here.

That way links won’t break and I can finally utilize all the modern shenanigans I’ve installed.

Future ahoy!

Edit 2013-06-15: The missing comments from the transition period are also on board now, me hearties!

The 80s called, they want their Hulk-Hogan muscle shirt back.

Bitcasa Everywhere Chrome modification for infinite queue

Addendum 2014-05-21: I received a mail from Bitcasa informing me that this modification polls Bitcasa’s services so much that it has undesired side effects. Contrary to what you might believe, it was not a threat or any sort of lesson in legal issues but a simple request backed by very reasonable, technical arguments.

If you are not familiar with how the modification worked, here is the short version: BCE Mod created a background timer that would poll Bitcasa’s endpoint every x seconds to update its internal status, log you in, trigger new downloads and so on. One person using this method is not a problem. Add an undefined number of people and the trouble starts: every user with this mod increases the stress on Bitcasa’s web interface considerably due to the unending stream of requests.

Now here is where my dilemma starts: I was out to improve the user experience and show that it does not take much to do so, not to harm the service I want to see prosper for years to come. Unfortunately, that harm seems to be the case now, making the life of the good folks at Bitcasa harder – not cool.

So please understand that I will not offer or work on this modification anymore. If you still use it, I recommend you uninstall it immediately because it will not work as intended anymore; all it will do at this point is lock you out of My Bitcasa for a few minutes due to the number of requests. If you are interested in…

  • An infinite queue for your Bitcasa Everywhere downloads
  • Automatic login to My Bitcasa
  • Tighter integration with 3rd party services
  • A more up-to-date Bitcasa client update check (possibly an official announcement for each new release via Twitter?)

…please vote for these features in the official feature request section! The more votes a feature gets, the better! If a feature is not feasible, you will receive official word on why it will not make the cut. Also consider voting for the addition of some kind of file-download extension to Bitcasa’s API, giving third-party developers more freedom to interact with Bitcasa without having to play the “middle man” for file caching. Again, sorry to everyone at Bitcasa for the inconvenience caused, and sorry to everyone who came here expecting a turbocharger for their Bitcasa Everywhere!

Quick note: Bitcasa + prepaid credit-cards

I’m not a fan of credit cards. Personally speaking, I think Paypal, despite all its flaws, is the slightly lesser evil.

Paypal gets the one thing right about online payment: don’t allow charges without user authorization. That’s where credit cards fall short, in my opinion.

Needless to say, I was quite disheartened to learn that Bitcasa only accepts credit cards as payment method (although the legal page hinted strongly at that during the beta). Luckily, prepaid services like Kalixa, Wirecard or Neteller seem to work fine with Bitcasa. While not a perfect solution, this at least postpones the problem for a year for me.

Once again, Wuala gets it right while others fail miserably with the same tools at their disposal: Paypal recurring payments, Paypal subscriptions (usable without a credit card) and even Bitcoin are offered as payment options.

Bitcasa releases new client – to infinity… and beyond?

Remember Bitcasa? The guys who started last year with the daunting promise of infinite cloud storage for a fixed price of 10$/month? I tried the service back in February and wasn’t exactly thrilled; it felt more like a half-assed Dropbox clone with truly dreadful software to manage your data. Another turn-off for me was that, at the time, it was available for Windows only, which is a no-go in this day and age.

Simply put: I did not care about the service for a long time, until I got a rather interesting newsletter from Bitcasa a few days ago highlighting their new range of clients.

Bitcasa now calls itself the “Infinite Drive”, a clever spin to highlight what their new client is all about. Instead of pestering me with a confusing GUI that makes no sense whatsoever, I get what I have always wanted from the service: a Wuala-esque file-system integration via a virtual drive (on Windows).

Bitcasa Infinite Drive

A client I can understand also means that I finally had a chance to actually use and test Bitcasa. Trying to upload an Ubuntu ISO resulted in me having to upload the entire file, so unfortunately there seems to be no Wuala-esque pre-upload check for file availability.

Bitcasa gets a big gold star for making the stupid sync/mirror thing the old client did by default an optional feature. This means files I upload will not automatically be downloaded on every connected machine (which, quite frankly, is the only sane thing!).

One gripe I have with the simple new client is that it does not offer to pause uploads. You can either use Bitcasa and have it block your upstream with its jobs, or quit Bitcasa and not use it at all.

Bitcasa announced that they will go into paid operation starting in early 2013. I’m curious what payment methods they will accept and what payment providers they will work with (hopefully at least one that does not require a credit card!).

Bottom line: 10$/month for infinite storage (limited only by your very own upstream capacity) is pretty sweet, and the new client is a definite improvement over the old trainwreck.

I’m excited to see how this will work out for Bitcasa and whether or not the business model will survive over time. Because that’s what I expect from a cloud-storage provider: To actually stay in business and to keep my files safe. Whether Bitcasa will pull this off… we will see. 🙂

Pochi to Nyaa Soundtrack

Didn’t I just write about the Neo-Geo? Yes, and I realized that this year (or next year, depending on your interpretation of things) Pochinyaa celebrates its 10th anniversary.

In case you don’t know the game: Pochi to Nyaa is a puzzle game like Sega’s Puyo Puyo; you align coloured “cat blocks” and send them away with a boom – much like in Puyo or Tetris. The twist is that you can control how long you want to hold off firing the chains. Want a 20-block chain? Do it… or at least try. Big chains send “concrete cat blocks” to your opponent’s field (again, much like Puyo or Tetris).

The game starts off calm and cute but quickly starts biting you in the bottom after the third or fourth round, with a vicious AI that retaliates mercilessly.



What makes Pochinyaa great – apart from the gameplay – are the quirky visuals and the incredible soundtrack. In honor of its 10th anniversary I looked for my Playstation 2 version of the game, extracted, converted, tagged and uploaded the music for you.


Nyaa and Pochi, the two mascots of the game

Whoever holds the rights to the game now: you really should re-release it on newer platforms. Pochi to Nyaa is awesome and exactly the way a puzzle game should be: cute as a button and hard as nails.

Anyway, here is the soundtrack. Thank you ~nya!

Launching FBA’s MVS Mode

FB-Alpha is an awesome piece of software. For the games I play (mostly NeoGeo games) it’s damn near perfect because it has an emulated MVS mode where you can queue up to 6 Neo-Geo cartridges and cycle through them after inserting a token. The only thing that irks me is the absence of a dedicated MVS switch to start the emulation directly, without me pressing Enter first.

Luckily, that’s where AutoIt comes in handy:

[code language="autoit"]
; +--------------------------------------------------------+
; |                                                        |
; | FBA NeoGeo MVS Launcher                                |
; |                                                        |
; | Launches FBA in MVS mode and automatically applies the |
; | last cartridge configuration used.                     |
; |                                                        |
; +--------------------------------------------------------+

; Prefer the 32-bit executable if it exists, otherwise use the 64-bit one
Global $executable = "fba64.exe"
If FileExists("fba.exe") Then
    $executable = "fba.exe"
EndIf

If FileExists($executable) Then
    ; Launch FBA directly in Neo-Geo (MVS) mode
    ShellExecute($executable, "neogeo")

    ; Wait for the cartridge selection dialog to appear...
    While Not WinActivate("Select cartridges to load...")
        Sleep(500)
    WEnd
    WinWaitActive("Select cartridges to load...")

    ; ...and confirm the last used slot configuration
    Send("{ENTER}")
EndIf
[/code]

Simply compile the script, drop it into your FBA directory and pre-configure the MVS slots in FBA. Launch the script afterwards… and yeah, that’s it.
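In case you have never compiled an AutoIt script before, it is a one-liner with the Aut2exe compiler that ships with AutoIt – the script file name here is simply whatever you saved it as:

[code]
Aut2exe.exe /in FbaMvsLauncher.au3 /out FbaMvsLauncher.exe
[/code]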

For your convenience, there’s also a precompiled version available.

Be sure to set the Neo-Geo BIOS to the correct 6-slot system, otherwise you obviously cannot cycle through the games.

ZNC – Cannot see messages with multiple clients?

If you’re using ZNC 1.0 and connect with multiple clients, you may have noticed that under certain circumstances messages sent from one client do not show up on the others.

The cause could be your module configuration. In my case I had to deactivate the crypt module to correct the behaviour, as the module seems to block the messages from being broadcast to all clients. Bummer.
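If you are affected, you can unload the module at runtime from any connected client – assuming crypt is loaded as a user module and you kept the default *status prefix:

[code]
/msg *status UnloadMod crypt
[/code]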

Relocating databases in Progress OpenEdge

One way to do it, if procopy is not an option, goes like this (a concrete command sequence follows below):

  • Copy the database and all its files (d*, b*, .st) to the new location
  • Edit the .st file in a text editor and replace the occurrences of the old path with the new one
  • Open a ProEnv prompt and navigate to the database’s directory
  • Run: proutil <database name> -C truncate bi
  • Run: prostrct repair <database name>
  • Check whether all went well: prostrct list <database name>
  • If your admin server already knows the database, try to start it: dbman -db <database name> -start
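Using the classic sports demo database as a stand-in for the real name, the whole ProEnv session looks like this:

[code]
proenv> proutil sports -C truncate bi
proenv> prostrct repair sports
proenv> prostrct list sports
proenv> dbman -db sports -start
[/code]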

One thing I noticed is the difference in output between dbman and the old Progress Explorer Tool. While the Explorer gives you some meaningful output, dbman often responds with DBMan022, which the Progress Knowledge Base refers to as a database error… duh.