Friday, April 29, 2011

Developers need to understand licensing too

I am not a DBA. Well, possibly by very loose standards, but I'm not professionally or personally responsible for any databases that contain any valuable data.

However I do work with databases on a daily basis and that means I need to be aware of licensing 'things'. Often I get access to accounts with privileges for various dictionary views. And there are some views and tables which can only be queried after supplying large amounts of folding paper to your friendly Oracle reseller.

That's one of my dislikes about Oracle. I'd be much happier if those views and packages which require the Diagnostics, Tuning or whatever packs were secured by some additional role to make them EXTREMELY obvious.
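No such role exists, but as a sketch of what you can at least check yourself (assuming 11g or later, where the CONTROL_MANAGEMENT_PACK_ACCESS parameter was introduced):

```sql
-- Check whether the Diagnostics/Tuning packs are nominally enabled.
-- Possible values: NONE, DIAGNOSTIC, DIAGNOSTIC+TUNING.
select value
from   v$parameter
where  name = 'control_management_pack_access';

-- Features that count as 'usage' of licensed options show up here.
-- The view is documented; interpreting the licensing consequences
-- is between you and your friendly reseller.
select name, version, detected_usages
from   dba_feature_usage_statistics
where  detected_usages > 0
order  by name;
```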

And development and test databases are another thing. It's all very well pretending things can be covered by OTN licences, but some things aren't. Throwing up another database for testing isn't necessarily covered by your existing licences. And what about that read-only standby? Or that one on the VMware server?

Anyway, I've got my diary booked in for Mogens Nørgaard talking to us Sydneysiders about the joys of Oracle licensing in just under two weeks. If you've got any questions you want to pose (apart from why are they so bleeding expensive), add a comment.

I'd like to know more about Amazon Oracle pay-by-the-hour, due in the next couple of months, but I think we'll all have to wait for that.

Downtime, SLAs, compensation and other fairytales.

My brother-in-law in England has a beautiful framed photograph in his living room. It is of a small cottage with some trees just as a small sprinkling of snow has fallen on it. It's something that wouldn't look out of place on a Christmas card. But it isn't a commercially bought print. It is a photo that he took when chance put him in the right place at the right time. Plus he knows a thing or two about cameras and was able to do justice to the scene.

It is all the more poignant because he took the negatives in to a photo shop to get some more prints and the shop lost the negatives. He would have got a few pounds in compensation for that loss. He'll never see the negative again though. For youngsters, cameras used to use film, which was processed into negatives from which prints were made.

There were some places where you'd drop off the film, they'd send it off somewhere else to be processed, who'd deliver it back to the shop where you'd get the prints. Like a 'cloud' service, you never really knew where the film actually went. Keen photographers generally went somewhere which did their own development on the premises and things were less likely to go wrong. At the really cheap places, if something went wrong, you'd get a replacement film, blank of course.

This isn't about 'cloud'. It is about compensation. You don't get the photos back. Having an SLA promise you 50% off your month's hosting fee of $20 if they have excessive downtime is pretty pathetic if you've lost $1000 in business.

If you are paying for a service, whether it is power, telephone, data storage or CPUs, you are paying less than the service is worth to you. [If you are paying $50 for something you only value at $25, then your boss's nephew probably owns the supplier.] The provider has costs too, so the benefit (profit) to them in having you as a customer is less than the benefit (avoided loss) to you in having them as a supplier.

If you own it, you have to look after it.

Monday, April 25, 2011

In which I apologize for the Amazon AWS Outage

Honestly, I didn't mean it to happen.

I'm a (very) occasional user of AWS. And it is one of those businesses on the Internet where I pay using a pre-paid Visa card. It's one I have to remember to charge up every so often. And, in my defence, I've been a bit busy the past couple of weeks.

On April 18th I got an "urgent" message from Amazon saying that their attempt to charge me $0.09 for March usage had failed. And that if I didn't rectify the problem, they might suspend my account. But I was still busy so I didn't get round to recharging the card.

On April 21st, I got an even more urgent message from Amazon about the 9c I owed them. My account was going to be suspended. That was sent about 2:30 am PDT. Apparently that 9c was pretty important as Amazon then had its tummy ache.

I have since recharged my card and today got a message that Amazon had successfully processed the 9c charge. Hopefully all problems will be over. If I had known that my 9c would bring down half a data centre, I'd have made sure I was charged up. I'll try to make sure it doesn't happen again.

Thank you for your patience.

PS. Some 'scientists' may start talking about the difference between 'correlation' and 'causation' and the lack of any technical analysis in my reaction. But I remember reading "For want of a nail" in primary school, so it's quite logical to me! Besides, my horoscope said that coincidences don't exist, it's all part of a larger design, and don't trust Aquarians.

PPS. For a more serious take on cloud storage, try Joyent's blog (link via Robert Young at "Dr Codd was Right")

Saturday, April 02, 2011

How COLLECT displaced User Defined Aggregates

A recent PL/SQL Challenge question focussed on a user-defined aggregate. Not a lot of people got it right, and one reason is that user-defined aggregates have never been particularly popular. Tom Kyte's STRAGG function (dating back to 2001) was probably the best-known usage. But they are basically complicated and a bit ugly.

When 10g came along, it added the COLLECT aggregate function. That meant that the old style user defined aggregates were pretty much redundant. It is a lot easier to aggregate up using COLLECT and then pass the collection to a regular function. 

It is also more flexible. For example, since a user-defined aggregate could only take one parameter, it wasn't possible to pass a delimiter to the STRAGG function. This implementation, based on a 'regular' function, overcomes that:


-- The collection type isn't shown in the original; something like
-- this is assumed (the element length of 4000 is a guess):
create or replace type tab_char as table of varchar2(4000);
/

create or replace function coll_to_delimited 
  (i_coll_type in tab_char, i_delim in varchar2 default ',')
  return varchar2 
IS
  v_ret varchar2(32767);
BEGIN
  FOR i IN 1..i_coll_type.count LOOP
    IF instr(i_coll_type(i),i_delim) > 0 THEN
      raise_application_error(-20001,
          'Delimiter present in string:'||i_coll_type(i));
    END IF;
    v_ret := v_ret || i_coll_type(i);
    IF i != i_coll_type.count THEN
      v_ret := v_ret||i_delim;
    END IF;
  END LOOP;
  RETURN v_ret;
END coll_to_delimited;
/

select coll_to_delimited(cast(collect(table_name)
                             as tab_char),'|') 
from user_tables;

I've even included a check to indicate when a string already contains the delimiter. I don't bother with a specific check on the length of the result, as exceeding the varchar2 limit would raise an error anyway. 

Generate an SQL replay file using TKPROF

Handy hint. You can use TKPROF's "record" option to extract a 'replay' script from a trace.

SQL> exec dbms_monitor.session_trace_enable(binds => true);

PL/SQL procedure successfully completed.

SQL> create table trace_test (id number, val varchar2(20));

Table created.

SQL> insert into trace_test values (1,'abcef');

1 row created.

SQL> insert into trace_test values (2,'xyz');

1 row created.

SQL> disconn
Disconnected from Oracle Database 10g Express Edition Release - Production

Then switch to the udump directory and find the trace file you created. 
In my case it was xe_ora_3972.trc

tkprof xe_ora_3972.trc r2.txt record=rec.txt

TKPROF: Release - Production on Sat Apr 2 10:02:28 2011

Copyright (c) 1982, 2005, Oracle.  All rights reserved.

Then you can replay the statements captured in your record file, for example by running it as a script in SQL*Plus (@rec.txt).

H:\oraclexe\app\oracle\admin\XE\udump>more rec.txt
BEGIN dbms_monitor.session_trace_enable; END;
BEGIN dbms_monitor.session_trace_enable(binds => true); END;
create table trace_test (id number, val varchar2(20)) ;
insert into trace_test values (1,'abcef') ;
insert into trace_test values (2,'xyz') ;