Catching up with my blog reading this morning, I came across this piece. In particular, to quote
When building an ontology for a particular application, it is important to know how this ontology will be used. If the ontology is used to support logical inference, it is important to know the kind of logical inference that the application will need to support; is it within the capability of a target reasoner?
With my current level of ignorance I don't feel qualified to answer that question; I'm verging on being unable to understand it. I'm tempted to put a page of questions up on the W3C wiki, just asking the questions I have so far. My diagram at this entry shows my current confusion. I'm in need of a piece of prose that links tools and files together with common sense. I have a nagging feeling that RDF's open-world assumption is a possible source of problems for my pragmatism. Give me a tool that lets me shut that openness out for a while, until I gain confidence in what I'm authoring!
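To make the open-world complaint concrete: under RDF's open-world assumption, a triple that is absent from the graph is merely *unknown*, not false, so nothing flags the statements I forgot to write. While authoring, I'd rather treat absence as an error. A toy sketch of that closed-world check, using plain Python sets rather than any real RDF library, with all names invented for illustration:

```python
# Closed-world authoring check (toy sketch, invented names).
# Open world: a missing triple is "unknown". While authoring,
# I want "missing" to mean "wrong", so I check expectations explicitly.

triples = {
    ("ex:moby_dick", "rdf:type", "ex:Book"),
    ("ex:moby_dick", "ex:author", "ex:melville"),
}

# Statements I believe my data file should contain.
expected = [
    ("ex:moby_dick", "rdf:type", "ex:Book"),
    ("ex:moby_dick", "ex:publisher", "ex:harper"),  # not asserted yet
]

# Anything expected but absent is a closed-world failure.
missing = [t for t in expected if t not in triples]
for t in missing:
    print("closed-world check failed:", t)
```

An OWL reasoner will never report the missing publisher triple; this kind of ad hoc check is the "tool that shuts the openness out" I'm wishing for.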
I'm currently trying to generate a data file: a few statements about a couple of books to exercise the ontology. For many of the properties I've had to go back and modify the definitions, which doesn't bother me, though it doesn't help that I can't check my facts are aligned with the ontology in any way I find useful. I'm going to have to dig into a reasoner before long. Which one to choose? I've been making more use of Pellet these last two days. My error list (previously ignored) has now shrunk to the point where I'm prepared to ask about the remaining entries. Some 'errors' (I suspect that isn't the right word in this domain) are just weird.
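The sort of alignment check I'm missing could be sketched as follows: verify that every property used in the book data is actually declared in the ontology. This is a poor man's lint, well short of what Pellet does, and all the names here are made up for illustration:

```python
# Poor man's data/ontology alignment check (sketch, invented names).
# Flags any property used in the data that the ontology never declares,
# e.g. a typo that a reasoner would silently accept as a new property.

ontology_properties = {"ex:author", "ex:title", "ex:publishedIn"}

# A few statements about a couple of books, as (subject, property, object).
data = [
    ("ex:b1", "ex:title", "'Moby Dick'"),
    ("ex:b1", "ex:author", "ex:melville"),
    ("ex:b2", "ex:titel", "'Emma'"),  # deliberate typo: undeclared property
]

undeclared = sorted({p for (_, p, _) in data if p not in ontology_properties})
print("undeclared properties:", undeclared)
```

Because OWL lets anyone mint new properties, this check is wrong by open-world standards, which is exactly why it would be useful while authoring.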
Came across this site today. I've been meaning to have a look at Ruby for some time. Looks like this could be a way to do it.
Keywords: owl