
RE: FYI ... An article on CIM, Policy Framework WG and modeling from Network Computing


A network information model, meaning a model of topology, connections,
virtual circuits, VPN partitions, and other relationships between network
elements, belongs in an information tier where it can be accessed by any
authorized business process in the enterprise. This can be implemented with
a directory, an OODB, or other data storage & access technology. For an
explanation of information tier concepts in distributed computing and
network mgmt, please see:


As for modeling protocols, at least as it pertains to network and service mgmt
(my interest), I don't think protocol modeling is critical if one takes a
more modern approach and minimizes the number of protocols in use. At this
point in time, the O-O paradigm has subsumed distributed computing. What
needs modeling are object interfaces, which are then made available by means
of a very small number of protocols that enable transparent distributed
computing, such as IIOP, Java RMI, or SOAP. These object interfaces are
modeled with UML. I favor a move away from the new-protocol-per-problem
approach traditionally taken by the IETF to a new-interface-per-problem
approach, which I believe fits better into the new world of O-O distributed
computing, is easier to understand and manage, and definitely makes life
easier for the software developers who just want to invoke object
interfaces.
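To make the interface-per-problem idea concrete, here is a purely hypothetical sketch in plain Java. Every name in it is invented for illustration; a real version would extend java.rmi.Remote and be served over RMI or IIOP rather than instantiated locally:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

// Hypothetical "interface-per-problem": the VPN-partition problem gets an
// object interface, not a new wire protocol. A remote version would extend
// java.rmi.Remote and be exposed over RMI or IIOP.
interface VpnPartitionManager {
    void addPartition(String name);    // invented operation
    List<String> listPartitions();     // invented operation
}

// Trivial in-memory implementation, standing in for a real management system.
class InMemoryPartitionManager implements VpnPartitionManager {
    private final List<String> partitions = new ArrayList<String>();
    public void addPartition(String name) { partitions.add(name); }
    public List<String> listPartitions() {
        return Collections.unmodifiableList(partitions);
    }
}

public class Main {
    public static void main(String[] args) {
        VpnPartitionManager mgr = new InMemoryPartitionManager();
        mgr.addPartition("gold");
        mgr.addPartition("silver");
        System.out.println(mgr.listPartitions()); // prints [gold, silver]
    }
}
```

The point for the developer is that listPartitions() is just a method call; whether IIOP, RMI, or SOAP carries it underneath is an interchangeable detail, which is exactly why one small set of distribution protocols suffices.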

In a perfect world, management of a network should take only two or three
protocols. Not considering end users of services and thinking only of
management systems (computers) and network elements, there are three
possible combinations for interacting:

NE to NE -- signaling protocol
NE to computer -- management protocol
computer to computer -- distributed object protocol

It is difficult to see how signaling protocols could be replaced by object
interfaces, given the need for extreme speed and efficiency. However, on the
issue of management protocols, I recently saw a case where a network
equipment vendor is preparing to offer an optional CORBA interface in place
of its SNMP interface. As network elements become more sophisticated, it
seems only logical that their management interfaces will more often be
implemented as general distributed computing interfaces.
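As a purely hypothetical sketch of what such a replacement could look like, here is a device's management surface recast as an object interface. The operation names are invented; they simply mirror the kind of data an SNMP agent serves from the interfaces MIB (ifOperStatus, ifInOctets). A vendor's CORBA offering would define something similar in IDL:

```java
// Hypothetical object-interface equivalent of a few SNMP interface-table
// reads; a plain-Java illustration, not any vendor's actual API.
interface ElementManager {
    String ifOperStatus(String ifName);  // cf. SNMP ifOperStatus
    long ifInOctets(String ifName);      // cf. SNMP ifInOctets
}

// Stub element with canned values, standing in for a real network element.
class StubElement implements ElementManager {
    public String ifOperStatus(String ifName) { return "up"; }
    public long ifInOctets(String ifName) { return 123456L; }
}

public class Demo {
    public static void main(String[] args) {
        ElementManager ne = new StubElement();
        // The management system just invokes methods -- no PDUs, no OIDs.
        System.out.println("eth0 is " + ne.ifOperStatus("eth0")
                + ", " + ne.ifInOctets("eth0") + " octets in");
    }
}
```

The management protocol disappears into the distribution layer, and the modeling effort goes into the interface instead.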

The above argument is heavily laced with a network mgmt perspective, but I
think you might see how this idea can apply more generally.

Kirk Shrewsbury

-----Original Message-----
From: owner-nim@ops.ietf.org [mailto:owner-nim@ops.ietf.org]On Behalf Of
Tom Scott
Sent: Wednesday, September 06, 2000 5:47 AM
To: NIM List
Subject: re: FYI ... An article on CIM,Policy Framework WG and modeling
from Network Computing

Andrea and other Friends of Comprehensive Modeling:

An interesting article. Given all the groups, initiatives and models
discussed in it, where would NIM fit in?

Does anyone have an inventory of models? What has been modeled already
and how are the models intended to be used?

Specifically, who has modeled protocols -- not particular protocols
(although that would be a valuable byproduct) but protocols in
general? A positive answer to that question might save me the work of
adapting Gouda's procedure-oriented APN to an object-oriented model.

But protocols are only part of the Big Picture. What else should a
comprehensive model comprehend? Management, services, network
resources, information/data resources, protocols, and what else?

Tom Nelson Scott             Vedatel Co
1411 Sheffield Dr.           Bowling Green OH 43402
"In IP We Trust"   "Java Rules"   "E Pluribus Unix"

-------- Original Message --------
Subject: FYI ... An article on CIM, Policy Framework WG and modeling
from Network Computing
Date: Tue, 5 Sep 2000 01:07:54 -0700
From: "Andrea Westerinen" <andreaw@cisco.com>
To: "Nim@Psg. Com" <nim@psg.com>,"Policy@Raleigh. Ibm. Com"

Feature: Management Standards Come Together
(September 4, 2000)
By Bruce Boardman
Will management standards deliver on their promise of true
interoperability? Standards groups are collaborating to provide an
overarching framework with buy-in from all the major industry
vendors. Can they pull it off? Here's our take: