Friday, January 25, 2008

Debugging Reporting Services install scripts (.rss files)

So I was making some changes to a deploy script, and brought up the .rss file in SharpDevelop, simply for the fun of IntelliSense.

I got bored, and decided to reference rs.exe and see if it would decorate all my code for me.

I eventually realized the following:

All rs.exe does is take the -v parameters, turn them into variables, throw your .rss file in (it's already a Sub Main), then compile and execute it.

So, to step through your .rss scripts on-the-fly, or to completely replace them with a custom assembly, do this:

1) Create a new console application
2) Reference rs.exe in your project (in $SQLSRVR\blah\blah\binn)
3) Reference System.Web.Services
4) Imports Microsoft.SqlServer.ReportingServices
5) Declare your parameters as module level variables
6) Public Sub Main() contains the same code you'd put in the .rss file.

Hooray, no more stumbling in the dark, and your installer can be unit tested, refactored, resharped, signed, encrypted and converted to unmanaged C++, inlined assembly, or whatever your black heart desires.

NOTE: rs.LogOnUser throws an error saying it's not supported in RS for SQL Server 2000. I guess you can only run with the credentials of the launching user, or find some other way to impersonate.

Obligatory rant: why did they give these files a .rss extension? Could they not spend half a minute on Google to see that RSS already means something?

Oops, I forgot the important part. Add this module-level instance of variable rs:

Public rs As ReportingService = New Microsoft.SqlServer.ReportingServices.ReportingService()
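Putting it all together, the skeleton of the wrapper project looks something like this. This is just a sketch - the server URL, the folder names, and the CreateFolder call are made-up placeholders for whatever your actual script body does:

```vb
' Assumes references to rs.exe and System.Web.Services, per the steps above.
Imports System.Net
Imports Microsoft.SqlServer.ReportingServices

Module Installer

    ' Formerly the -v parameters passed on the rs.exe command line:
    Public targetFolder As String = "Deployed Reports"

    ' The global proxy instance rs.exe normally provides for you:
    Public rs As ReportingService = New ReportingService()

    Public Sub Main()
        ' rs.exe also wires these up behind the scenes:
        rs.Url = "http://myserver/ReportServer/ReportService.asmx"
        rs.Credentials = CredentialCache.DefaultCredentials

        ' Paste the body of your .rss script here, set a breakpoint, hit F11.
        rs.CreateFolder(targetFolder, "/", Nothing)
    End Sub

End Module
```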

Wednesday, January 23, 2008

Free XBLA Game to Make up for Worst Xmas Ever

I suppose it's better than nothing, though I think they would have done better to just throw everyone a few Xbox points and let them choose their own download. The game was widely panned by critics and gamers, but hey, free is free.

Link to story here.

You only have until Sunday, and I suppose if you snooze you lose.

Tuesday, January 22, 2008

Apple

This was an interesting read. Apple "tweaks" their port of DTrace to prevent people from debugging iTunes and its ilk.

I can only imagine it's to prevent people from grabbing unencrypted data from DRM'd music out of memory? What's even the point of that? Practically speaking, FairPlay has long since been broken. It's not like everyday Joes are tracing through running code so they can make an illicit copy of some iTunes song, and it's not like developers wouldn't know a better way. At the end of the day, the tool doesn't work. Go Team Venture.

I think Apple just really hates programmers.

Hooray!! I Have a Playchoice 10 Countdown Timer

It's actually out of a countertop, but I shouldn't have a problem fabricating a bracket to mount it in my single monitor cabinet.

I'm going to have to make a cable, but I'm not sure if I want to track down the proper connectors to do it. I'm leaning towards soldering on a 12-pin Molex; there's nothing remotely OEM about my Playchoice/VS hybrid anyways. I should post a photo of the ridiculous perfboard monstrosity that I created to do the switching, and I should probably sketch up a schematic for it too, before I forget what I did and it breaks.

I have PCB layouts I sketched to make a few more of them a little more professionally, and a little more advanced, but I haven't really set up to do any of the etching or drilling I'd need. I should look into some of the web-based PCB prototyping services again. I could get a dozen boards and sell ten on eBay to break even. I know that multi-JAMMA guy has a patent, but my approach and design are wholly different from his. I ain't 'fraid of no ghost.

I really need to get back to my cabinets, parts are cheap on eBay this time of year.

Why not list things to do?

Bad Dudes' monitor is on the fritz; it's the flyback transformer. I have a parts chassis and a cap kit for it, and I need to get around to rebuilding it. I also need to look up the DIP switch settings and put it on easy. I am not a bad enough dude to rescue the president.

Puck Man and Playchoice 10 both have at least one gun on the CRT flaking in and out. The PC10's is beyond repair, but I'm loath to put a regular monitor with an inverter in there. I may swap in the Donkey Kong's monitor; it's in great shape, and I'm unlikely to be jumping over barrels any time soon - I'm going to need to spend lots of hours trying to get anything out of that board (or lots of money replacing it).

Puck Man's is probably fixable, but I haven't been able to ID the monitor. It's no doubt a cheap Chinese knock-off, but it's incredibly well built and has a really stable, sharp image. They really don't build them like they used to. I've never seen a TV set or PC monitor that came near the build quality of those things. On the bright side, it looks like a good tube swap candidate, and I have a TV tube with matching pins and impedances, so I'll keep my fingers crossed.

Both my tables could use new joysticks; I really need to start watching eBay again. Regular sticks don't fit. I'm likely to find original replacements for the Time Pilots, since there were plenty of those old Centuri/Konami/Sega cabinets out there.

Puck Man is a Chinese bootleg circa 1979, though. Its buttons and sticks are downright weird. From what I can tell scouring catalogs, at no time did any manufacturer sell pushbuttons of that weirdo diameter.

I need to build/buy a new EPROM eraser, so I can go ahead and patch Puzzle Bobble 3x into English, and put the UniBIOS on the Neo Geo. I'm not ashamed to cheat to see all the endings to Samurai Shodown.

THEN, I can get to work on my NES cartridge adaptor for the Playchoice 10.

It strikes me that a Neo Geo ROM emulator is a cost-effective proposal at this point; flash is cheap now, and it's effectively the same setup as a Genesis cart. I should put that on my todo list. If I can pull it off, a CPS2 ROM emulator would absolutely rock. I should check into the timings; I'm sure it would work for both.

On that front, I lost the code for the superufo ultimate mega SNES rom tool 2000 that I wrote, so now I need to rewrite it; I can't use my UFO without it. I wrote it years ago to split up SNES ROMs and format floppies to 1.68 megs as it wrote them out, for use with my floppy-based SNES cartridge copier. I wonder if the code I found to format the floppy would even work under XP? I guess a four-line shell script in Linux could do everything it did - I stole the muscle out of uCON anyways.
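For what it's worth, the splitting half really is a one-liner these days. A rough sketch (file names are made up, and I'm going from memory on the fdutils incantation): a 1.68 MB "DMF" floppy holds 1720320 bytes, so:

```shell
# Fake up a 4 MB "ROM" so the example is self-contained, then chop it
# into floppy-sized pieces (1720320 bytes = 21 sectors x 2 heads
# x 80 cylinders x 512 bytes).
dd if=/dev/zero of=game.smc bs=1024 count=4096 2>/dev/null
split -b 1720320 game.smc game.part.
ls game.part.*
# The oversized format itself would be something along the lines of
# (fdutils, untested from memory):
# superformat /dev/fd0 sect=21
```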

So that's my story, I can read this later when I'm bored and maybe be inspired to do something. I feel lazy now.

CTRL-F Yourself

How come Visual Studio's Find function can't search within classes, methods, or namespaces? The closest you get is "current block".

For that matter, how come the MS guys don't use a class explorer? Isn't the logical structure of the application more important than the physical layout of the source files?

db4o and Transparent Persistence

So db4o supports transparent activation now - not actually pulling an object's data out of the database until the object is first touched. They claim the goal of completely transparent persistence has now been achieved. I worked with db4o a little before, about a year ago, and found it to be somewhat superior to the other ORM-type systems around. I'll have to play a little with the new transparency features and see just how invisible it actually is.

db4o product announcement

Monday, January 21, 2008

MTU Issues

I have a linux based firewall (Gentoo w/ Shorewall) which keeps resetting the MTU on the internet interface to 576, seemingly randomly.

I put a directive in conf.d/net to force the MTU to 1500, and manually typing "ifconfig eth0 mtu 1500" works fine, but eventually it resets back to 576. I'll generally notice it either when signing in to Xbox Live (which requires the big packets), or when my wife complains about her Yahoo webmail (which seems to drop connections when it can't send the big packets).
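For the record, the directive looks like this; my understanding is that Gentoo's ifconfig module picks it up whenever the interface comes up (interface name assumed to be eth0):

```shell
# /etc/conf.d/net
# Force the MTU whenever eth0 is brought up -- doesn't stop whatever
# keeps knocking it back down to 576 later, though.
mtu_eth0="1500"
```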

The Yahoo thing bothers me. I expect this behavior from Xbox Live, since it's by definition a broadband-only service, but how do dial-up users manage Yahoo these days? It doesn't really bother me; that was just a figure of speech. Dial-up users can jump in a lake, but so can sites and services that require MTUs of 1500 - let the packets fragment, you big babies.

Anyhow, if anyone knows what could be causing this issue, or even where to look in the logs - I plead ignorance here; I don't even know which daemon is in charge of changing the MTU (dhcpcd alone?). I tried disabling traffic shaping to eliminate it as a cause, to no good effect, and I've swapped out the network card and cables to eliminate any hardware issues. If it's the cable modem itself, I'm sure I'll be in for tons of fun trying to convince Comcast of that, though that wouldn't make sense - the issue is on eth0 of the router.

The problems seemed to start when the Vista laptop showed up on the network, but can Vista really be so bad that it screws up Linux's networking by osmosis?

(I fully realize nobody reads my day old blog no matter how awesome it is. Stay tuned for some world class animated gifs and midi backgrounds, I'm bringing back Geocities as web 3.0)

Stupid Preprocessor Tricks in C#

Did you know that this compiles:

#region methods

private

#endregion

void SomeMethod()
{
}



Neat, huh? You can just jam #region and #endregion directives everywhere, to make your code SUPER DUPER UNREADABLE. How about right in the middle of a statement? NO PROBLEM.

#region INSANITY
private void SomeMethod()
{
    string x = "I like " +

#endregion INSANITY

        "bees";
}


Neat, huh? So, my question is simple... Is there any reason you'd want to put region directives right in the middle of a routine, let alone a statement?

Edit: it makes sense that it works, since region directives are essentially whitespace to the compiler. I just found it amusing (as in, I screwed around for a half hour trying to figure out why I was getting a 'more than one protection modifier' error; a collapsed region was hiding an extra "public" keyword).

Call of Duty 4

If I walk into one more invisible barrier, or empty another clip into another bulletproof office cubicle wall, I just may have to smash the disc with a hammer and jam the polycarbonate shards into the next jerk who tells me what a fantastic game Call of Duty 4 is. I can't stand it when some items in the environment are "cover" and some are just for show. Basically, the task is to memorize each firefight, over and over and over, until you come out on top. There's no reacting on the fly, no scraping through by the seat of your pants... there's no fun. This game is pretty much just wanking material for war-nerd spec-ops wannabes.

I can never "get into" this sort of "delta tango unsub rainbow six tango mango" war-sim stuff anyways. Maybe I grew up around too many part-time army guys who figured they were Rambo every Tuesday from 6-8 PM. The jargon annoys me, and I hate having to carry a "squad" of AI dipsh*ts. Also, they could have rendered some new environments to be "Middle East," instead of reusing the "war-torn Germany" maps from the earlier COD games. I could forgive all this if the game itself were fun.

Between this and BioShock, 360 owners are really showing themselves to be a bunch of low-IQ drunks, easily impressed by flashing colors. They should just put a random fight sequence from one of the Blade movies on a video loop and release it as a 360 title. Throw in some quotes about how many pixels it is and how much resolution it has. That impresses morons, even if they don't really understand it. It would make GOTY.

I hear BioShock is nominated for umpteen zillion awards. Sheesh, what a self-masturbatory waste of time video game awards are. Hooray, shooting the same enemy in the same rooms over, and over, and over. Please keep innovating, XBox developers.

For the record, I liked Halo 3 a whole lot. No invisible walls, and plenty of in-game variety. I'm pretty sure all the people out there calling it overhyped are the same type of folk for whom anything sufficiently popular must be uncool, the type who constantly need to remind you that they're "totally different" - just like everyone else. Apparently it sucks because they sacrificed about a hundred lines of vertical resolution in favor of a second framebuffer. I doubt the average gamer knows what double buffering is, let alone the lighting effects it makes possible, but if Halo 3 is running at a slightly lower resolution, surely it is teh suck.

Unit Tests for Old Code and Crazy Dog Ladies

So my brother knows this crazy dog lady, who lives in a 100+ year old farmhouse and has decided to augment her dog collection with eight or so teenagers. Of course she needs more space to make this happen, so she brought a contractor out to ask about putting a bathroom into the shed to make it living space - because when I say crazy, I mean crazy.

The contractor, who no doubt regarded her with that mix of bemusement and confusion one gets when one wastes half a day scoping a monumentally stupid job, asked her, "Where do you think it's going to drain to?" She answered, "I don't care, a pipe out the back or something."

So here I am, sitting at my gig in front of a mountain of code - some of it good, though a lot of it is that overbuilt, "there's lots of words, it's extra double good" type of code all developers write out of ego.

But the problem, as it were, is a big push from management towards unit testing, and the old "code coverage" buzzword is getting tossed around. Once we have 80% or higher code coverage, we will have crossed the "bug count" singularity, and our codebase will implode upon its own awesomeness and form a new supermassive black hole. I don't know.

I'm long familiar with unit testing from a functional design standpoint - creating the tests first and building the code around them.

But I'm sitting in front of code with epic method calls. You know the type: "private void DoAllTheTasksInOneCall(string[] EveryParameterPossible, out int ReturnValueBecauseImAMoMo)". Of course, the definitions of the parameters, the return value, and what "AllTheTasks" means are nowhere to be found.

I need to pass or fail this thing, but I have no idea what it does. Like any public school teacher, though, it would behoove me to pass everyone, because a failure makes me look bad, not so much the original coder.

So, my question is, Where do you expect me to put the drain? Yeah, I could stick a pipe out the back but that just makes for a yard full of poop.

To the C# MSDN globetrotter all-star geeks: how do you approach unit testing legacy crap? In particular, how do you deal with private methods? I can think of a few approaches, each with pros and cons. We're on VStudio 2005 and TFS, and my coworkers have downed the Kool-Aid double-fisted; it is the most amazing IDE money can buy.

1) Run the built-in unit test wizard, and let it codegen a private accessor. Pros: I get a private accessor, and I'm always thrilled to have MSDN put source code in my projects with a dire warning that I never, ever, ever, ever, ever dare touch the code.

Cons: Well, if I refactor my project, my accessor is now junk. Why can't VStudio rebuild the accessor as a prebuild step of the unit test? Who knows. All my objects will be wrapped in these accessors, so I'm testing the wrappers - and not the code itself. If I want to mock objects with RhinoMocks or its ilk, I have to mock the wrapper. Why should I be unit testing autogenned dog crap? Ultimately, I'm indirectly testing private methods, and that's never preferable.
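For what it's worth, the generated accessor is just reflection with codegen sugar on top. A hand-rolled equivalent (class and method names made up, and trimmed down to the bare idea) is only a few lines:

```csharp
using System;
using System.Reflection;

class Legacy
{
    // Stand-in for the mystery private method under test.
    private int DoAllTheTasksInOneCall(string everyParameterPossible)
    {
        return everyParameterPossible.Length;
    }
}

static class PrivateAccessorByHand
{
    static void Main()
    {
        var target = new Legacy();

        // Dig out the private method and invoke it -- this is all the
        // wizard-generated accessor really does for you.
        MethodInfo method = typeof(Legacy).GetMethod(
            "DoAllTheTasksInOneCall",
            BindingFlags.Instance | BindingFlags.NonPublic);

        int result = (int)method.Invoke(target, new object[] { "bees" });
        Console.WriteLine(result); // 4
    }
}
```

Same drawbacks apply, of course: the method name is a string, so one rename refactoring and the test quietly rots.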

2) The [assembly: System.Runtime.CompilerServices.InternalsVisibleTo] attribute, which lets me simply call my classes directly from the test project. Cons: intellisense doesn't seem to work in the test project, but intellisense is pretty much a crutch for babies anyways - at least I'm creating and calling the classes directly. There are a few more mouse clicks and copy/pastes to sign the test project's binary and embed its PublicKey in the attribute, or else any joker could be instancing internal classes, and that will not do.
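The attribute itself is one line in the production assembly (the assembly and class names here are made up for illustration):

```csharp
using System.Runtime.CompilerServices;

// For a signed test assembly, the string grows a public key:
// "MyProject.Tests, PublicKey=<full public key blob from sn -Tp>"
[assembly: InternalsVisibleTo("MyProject.Tests")]

namespace MyProject
{
    // Invisible to other assemblies, but instantiable directly from
    // MyProject.Tests -- no generated wrapper in sight.
    internal class HiddenWidget
    {
        internal int Answer()
        {
            return 42;
        }
    }
}
```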

The problem with these approaches is they don't really seek to solve the real problem. Both assume I know what this object does in the first place.

My preferred option, the Holmes on Code approach:

3) Refactor, redesign, and rewrite the code. Tear this all out; it simply has to go. We're going to knock down all these classes, come in here with a new MVP model, and simply will not quit until everything is encased in 8 inches or more of spray foam. We'll bring in the experts to properly size and install a new septic system. It will be expensive, it will take a lot of time, but it will be done right the first time.

This takes time. I'm talking about going back to the functional design (creating it if it never existed), and in many cases starting over. But how else do you do it? Just throw an accessor onto some void method, call it, and assume that if I didn't trap an exception everything is hunky-dory?

Of course, refactoring and rewriting carries the largest risk of introducing new bugs, but it leaves the project with better, more testable code, which can then truly become more robust. It's a hard sell, though, when QA finds issues with code that worked before, and you try to tell them with a straight face that it "works better now" despite all the crippling new bugs. So I'm either explaining my 20% code coverage, or why my new "100% tested" code doesn't work as well.

I miss the days when unit testing was something only real developers knew about, back before Microsoft decided to borf up a sloppy knock-off of NUnit and hot-glue it to their "one size fits all" development environment.

Now "unit testing" is a buzzword in the "agile XTREMe 2 tha max" canon, and in danger of becoming a massive timesink. Good code is testable, but simply making bad code testable doesn't make it good. Folks sure seem to think so, though.