Finally, Some Success

Over the last few days, I’ve attempted to solve a bug relating to WordPress’ behavior of pinging the XML-RPC update services even on drafts and edits. There is a plug-in for resolving this, but if the user doesn’t have it installed, the result is multiple entries for the same post in the Updated Blogs section of the new Weblogs.us site.

However, if the article URI matches a URI already in the database, the ping can be ignored (no need to go out and fetch the RSS feed). Now that this works, I will test a few refinements that shift work onto the database server (a slightly heavier load there, possibly neutralized by sending less data) in order to greatly reduce the load on the Apache/PHP server. After this optimization, I will continue writing the code for the listings system.
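The duplicate-ping check described above can be sketched like so; this is a minimal illustration and not the actual Weblogs.us code (the table and column names are assumptions):

```python
import sqlite3

def should_fetch_feed(db, article_uri):
    """Return True only if this URI has not been pinged before.

    If the URI is already recorded, the ping is a duplicate (e.g. a
    draft save or an edit) and the RSS fetch can be skipped entirely.
    """
    row = db.execute(
        "SELECT 1 FROM pings WHERE uri = ?", (article_uri,)
    ).fetchone()
    if row is not None:
        return False  # duplicate ping: ignore it, no feed fetch needed
    db.execute("INSERT INTO pings (uri) VALUES (?)", (article_uri,))
    db.commit()
    return True

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE pings (uri TEXT PRIMARY KEY)")
print(should_fetch_feed(db, "http://example.weblogs.us/post-1"))  # True
print(should_fetch_feed(db, "http://example.weblogs.us/post-1"))  # False
```

The point of the trade-off in the text: each ping now costs one indexed database lookup, but a duplicate ping costs nothing on the Apache/PHP side, since the feed is never fetched or parsed.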

-John Havlik

[end of transmission, stay tuned]

Why Standards Matter

Why Standards Matter: Part 1 of a quasi-treatise rant on web standards.

Standards are sets of rules, or guidelines, that dictate what everyone they affect should comply with in order to provide the most uniform quality product possible. Every industry has standards, including the internet; the only difference is the repercussions of ignoring these guidelines. In most industries, if a company doesn’t follow a standard, a regulating body will typically fine it or even shut the location down. Or, even worse, the customers may stop purchasing the company’s products, granted that the consumer is intelligent enough to carry out such actions. Many industries are pressured to follow the guidelines approved by the International Organization for Standardization (that’s where ISO 9001 comes from, and the .iso extension for disk images, via ISO 9660). For the internet, there is the World Wide Web Consortium.

The W3C writes and maintains the specifications for HTML, xHTML, CSS, the DOM, XML, and a variety of others. Standards for the web are needed due to the mammoth number of individuals (some intellectuals, others who may as well live in a zoo) who write code that interacts with an end user through the web. This includes everyone from those who write a webpage to those working on Mozilla Firefox. Because no two individuals are alike, people typically have different opinions and ideas on how a particular object should be represented. In web browsers this resulted in Microsoft Internet Explorer rendering an HTML document differently than Mozilla Firefox or KDE Konqueror. Instead of coding a website once and having it display correctly in every web browser, the developer has to test in several and work out any ‘bugs’ due to rendering discrepancies. As humans are inherently lazy creatures, many individuals met and formed the W3C to alleviate this situation by laying out a common framework for everyone to work from. Standards are great when everyone follows them to the ‘t’. Otherwise, once egos or other forces cause one individual to stray from the standards, unforeseen consequences may result.

Picture little Jimmy: he’s enthusiastic about learning how to create his own web page, but doesn’t know how. As no one he knows, or would ask, knows about web programming, he’s never heard of the W3C. Now Jimmy isn’t by any means nerdy, and doesn’t have the time to search Google for a good tutorial on web programming (and read it). So what does Jimmy do? He views the source code of his favorite web site. What happens if this site doesn’t adhere strictly to the standards laid out by the W3C? Instead of only one non-compliant webpage, there are now two, and if Jimmy likes web programming there could be many more until he learns better. Jimmy may never learn, because the rendering of his pages only matters in his favorite browser (most likely Internet Explorer). From now on, one will refer to this as the ‘Jimmy dilemma’.

The ‘Jimmy dilemma’ gets worse: even with complete compliance with web standards, Jimmy may still end up mimicking a website incorrectly, creating invalid code. For every educated web programmer in the world, there are ten Jimmies. Only one practical solution exists to alleviate this situation. Some may see this solution as a bit harsh, but it’s better than the alternatives. Ultimately, what needs to happen is that browsers must refuse to display a non-standards-compliant webpage. Practicalities may render this difficult, as Microsoft has in the past implemented features in Internet Explorer with no regard for established standards. If implemented correctly by all parties, though, this method is 100% effective, and more practical than the alternatives. How so? Observe the ‘Jimmy dilemma’ once more. If Jimmy were to produce invalid code, the result would be a blank page. Even by ‘hacking’ his way through the creation of the page, for it to display at all it must be valid code. Jimmy will either become fluent in a standards-compliant web language (xHTML 1.1) or will give up, leaving others to take his place. Remember that everyone was once in Jimmy’s shoes, and therefore comprehensive, easy-to-understand tutorials should be made available by the web browser’s publisher. These tutorials should be in the spirit of the Creative Commons licenses. What one means by this is that the W3C guidelines, taken straight from their web site, are difficult and time-consuming to comprehend fully, akin to what Creative Commons calls “Legal Code” (it used to be something like “Lawyer Speak”, but that’s since changed).
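The ‘refuse to display’ idea is not hypothetical: XML parsers are required to halt on malformed input, which is exactly how browsers treat xHTML when it is served as XML. A minimal sketch of that behavior using Python’s standard XML parser (the markup strings are invented examples):

```python
import xml.etree.ElementTree as ET

valid = "<html><body><p>Hello, web!</p></body></html>"
invalid = "<html><body><p>Hello, web!</body></html>"  # unclosed <p> tag

def renders(markup):
    """Mimic a draconian browser: display only well-formed markup."""
    try:
        ET.fromstring(markup)
        return True   # well-formed: the page displays
    except ET.ParseError:
        return False  # malformed: the user gets a blank/error page

print(renders(valid))    # True
print(renders(invalid))  # False
```

Under this regime Jimmy’s invalid page simply never renders, so he gets immediate feedback instead of an Internet-Explorer-only page that merely looks fine.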

-John Havlik

[end of transmission, stay tuned]

Firefox 2.0

Last night, Mozilla unleashed Firefox 2.0 into the wild. Naturally, yours truly promptly installed the new version, before the updater said it was ready. One tested the Deer Park and Firefox 1.5 betas and alphas, but since then little need appeared to beta test 2.0 before the release. 2.0 is an excellent browser, with a few things one wishes would be fixed. The spell-checking ability is great, but an accustomed word processor user on Windows natively presses the F7 key to spell check, and, to say the least, that doesn’t bring up a spell-checker dialog. Under History, a ‘Recently Closed Tabs’ section appears, a definite time saver for those of us who inadvertently click close tab, only to have to find the link again. Improvements to tabs include a minimum width so that the titles are still readable, with the extras spilling off into a scrolling tab bar. The only eye-candy feature that would be nice for tabs is a hover preview of a tab; one believes IE7 has this, but it’s not about being IE7.

Now for the fun compatibility issues, courtesy of the short-sightedness of web programmers. Since some don’t believe that newer versions of a web browser will be as feature rich as their predecessors, nice pop-ups show up alerting one that the browser version is unknown. dAmn, the deviantART message network (internet chat), has some Flash failure issue; not very important, as it does work. Then there is WebCT, a web application that the U uses for professors to get content to students easily. WebCT generates a nice pop-up window explaining that the version of the browser one is using is not supported.
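Those pop-ups are the classic failure mode of exact-match user-agent sniffing: the site checks the version string against a fixed allowlist, so any browser newer than the list is treated as unknown. A minimal sketch of the mistake (the allowlist and user-agent strings are invented for illustration):

```python
import re

# A site's hard-coded allowlist, frozen in the Firefox 1.x era
# (invented example, not WebCT's actual check).
SUPPORTED_FIREFOX = {"1.0", "1.5"}

def is_supported(user_agent):
    """Exact-match sniffing: anything not on the list is 'unsupported'."""
    match = re.search(r"Firefox/(\d+\.\d+)", user_agent)
    return match is not None and match.group(1) in SUPPORTED_FIREFOX

old = "Mozilla/5.0 (Windows; U; Windows NT 5.1) Gecko/20051111 Firefox/1.5"
new = "Mozilla/5.0 (Windows; U; Windows NT 5.1) Gecko/20061010 Firefox/2.0"

print(is_supported(old))  # True
print(is_supported(new))  # False: a strictly newer browser gets the pop-up
```

Testing for the feature one actually needs, rather than for a remembered list of version numbers, avoids the whole problem.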

-John Havlik

[end of transmission, stay tuned]

Some Things Work

This XML-RPC coding and debugging is a real pain. From investigation, one has learned that the script only works when called from a file under the main directory of a domain or subdomain. That means rpc.weblogs.us, beta.weblogs.us, and others will work if the script is called from index.php, while something like weblogs.us/rpc or beta.weblogs.us/rpc refuses to work. In refusing to work, the debugging script sends an XML-RPC ping but receives no response whatsoever from the server. So, to test the script, one must manipulate the index.php file to collect data, and then revert it to view the site.
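For reference, the ping round-trip being debugged looks like this. A self-contained sketch using Python’s standard XML-RPC modules in place of the PHP scripts: the handler body is a stand-in, not the Weblogs.us code, though weblogUpdates.ping with an flerror/message reply is the standard method WordPress sends:

```python
import threading
import xmlrpc.client
from xmlrpc.server import SimpleXMLRPCServer

def ping(blog_name, blog_url):
    """Stand-in for the server-side handler; a real update service
    would queue blog_url here for an RSS fetch."""
    return {"flerror": False, "message": f"Thanks for the ping, {blog_name}"}

# Serve the handler on localhost; port 0 asks the OS for a free port.
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
server.register_function(ping, "weblogUpdates.ping")
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# The client side: what WordPress (or the debugging script) sends.
proxy = xmlrpc.client.ServerProxy(f"http://127.0.0.1:{port}/")
reply = proxy.weblogUpdates.ping("Example Blog", "http://example.weblogs.us/")
print(reply["message"])

server.shutdown()
```

A timeout with no reply at all, as seen with the /rpc paths, points at the request never reaching the handler (a rewrite or routing problem) rather than at a fault in the XML-RPC code itself.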

Speaking of things, in our discussion group for the History of Rock on Friday, we received our music observation essays back. Our TA then commented on the problems generally seen in the essays, and then came what has to be the quote of the day: “…I like things, stuff and shit, but my things, stuff and shit are probably different than your things, stuff, and shit…” The moral of the quote is that one shouldn’t use the words things, stuff, and shit in a paper, no matter how informal the paper is. One’s paper had no problems with this, but obviously others did.

-John Havlik

[end of transmission, stay tuned]

ATI and AMD, To Be One Big Company?

Rumor mills keep spewing out ‘insider’ information that an ATI-AMD merger is planned. Many have commented that this would hurt Intel and nVidia, as ATI reportedly makes the best chipset for the upcoming Conroe (Core 2 Duo). With Intel and AMD as arch enemies, this chipset would likely never see the light of day. nVidia, as we all know, has helped AMD significantly with SLI and its enthusiast chipsets for AMD64 systems.

Reportedly, on Monday AMD and ATI will confront shareholders on the possibility of a merger, naturally for their approval. Then the FTC will have its way with AMD and stall the merger for several months, if it allows it at all. One speculation I can offer is that AMD’s current lawsuits against Intel may be used as cover to sneak the ATI merger through the FTC quickly. Other speculation on the net includes the claim that AMD doesn’t have enough revenue to buy ATI outright. Personally, I’d like to see ATI continue as an independent company.

-John Havlik

[end of transmission, stay tuned]
