The MIL Signal: Mad Scientist Lab
Spending years playing nice, developing software within the parameters, only gets you so far. Like a kid with blocks, it's more fun to try different things and see what works. Taking inspiration from various technologies and mashing them together is what sparks new and cool innovations. The prototyping of new solutions continues over and over in our lab, setting off the malfunction indicator lamp until the beast is finally unleashed.
For the longest time I had trouble grasping what a Web Service is. I read the articles in the trade publications, but as is typical of journalists, they speak in buzzwords and hype rather than substance. They also like to use WS for Web Services, when I already use WS for WolffySoft.
Dr. Marty Kalin of DePaul University spoke to alumni during DePaul's Silver Anniversary celebration. For 16 months DePaul is hosting a series of alumni events focused on technology and on reconnecting alumni with the programs currently underway at the school. While DePaul has been around for a long time, this marks the 25th anniversary of the first Computer Science degrees awarded to students.
When I was a master's student in Distributed Systems, my academic advisor was Dr. Clark Elliott, who was also in charge of the Distributed Systems concentration. Due to injuries from an auto accident, Dr. Kalin took over the position. Of course I was interested in attending this presentation, as I would finally hear about Web Services from someone who wasn't all hype. What follows is my take on the technology as I understand it from this presentation.
One of the main technologies in use now is the Web Application. In this type of application the client is a web browser (let's say Netscape) and all of the processing is done on the server. The application itself is "web" because it delivers HTML over HTTP, the protocol of the world wide web. To make the web application interactive, other technologies such as ASP.NET, Java and mod_perl are used. So in this example you are restricted to using a web browser as the client in your application, and the responsibility rests primarily on the server.
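To make that model concrete, here's a toy sketch in Python: the server holds all the logic and hands back HTML for a browser to render. The page content and port are made up for illustration.

```python
# Toy sketch of the web-application model: all logic lives on the
# server, which returns HTML for a browser to render.
from http.server import BaseHTTPRequestHandler, HTTPServer

def render_page() -> bytes:
    # The server builds the page; the browser only displays it.
    return b"<html><body><h1>Hello from the server</h1></body></html>"

class WebAppHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = render_page()
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(body)

# To actually serve pages:
# HTTPServer(("localhost", 8080), WebAppHandler).serve_forever()
```

Note how nothing here leaves the server except finished HTML, which is exactly why the client has to be a browser.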
A Web Service differs from a Web Application in two areas. First, while the client piece can be a web browser, it shouldn't be. In a Web Service the client is an actual piece of client software. Instead of having to mix technologies such as Perl and ASP, you have the ability to pick one set of tools and develop your client the way it needs to behave. The second difference is that while the HTTP protocol is still used, a service exchanges XML instead of HTML. Web Services are typically developed on top of existing technologies, such as IIS or Apache, on the server side to deliver the service.
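As a rough sketch, a web-service client might send XML over HTTP like this (Python shown here; the endpoint URL and message format are hypothetical):

```python
# Sketch of a web-service client: a real piece of client software
# sending XML (not HTML) over HTTP. The URL and message format below
# are hypothetical.
import urllib.request

request_xml = b'<?xml version="1.0"?><getWeather><zip>60604</zip></getWeather>'

req = urllib.request.Request(
    "http://example.com/weather",        # hypothetical service endpoint
    data=request_xml,                    # body present, so this is a POST
    headers={"Content-Type": "text/xml"},
)
# The client would then parse the XML reply however suits it:
# reply_xml = urllib.request.urlopen(req).read()
```

The point is that the client decides how to present the reply; the service just hands over data.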
There are several different standards for how web services should communicate with each other. One of these is the Simple Object Access Protocol (SOAP). One of the drawbacks to using SOAP, though, is the bloat added to a network communication by the additional XML required around the parameters. While larger documents aren't affected as much, smaller requests such as current weather could carry as much as a 10-to-1 ratio of markup to actual data.
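You can see the bloat by counting bytes in a hand-written (and simplified) SOAP envelope for a tiny weather-style request; the element names are made up, and the only real data in the message is a five-character zip code:

```python
# Counting the markup overhead in a simplified, hand-written SOAP
# envelope. The service and element names are made up for illustration.
soap_request = (
    '<?xml version="1.0"?>'
    '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">'
    '<soap:Body>'
    '<getTemperature><zipCode>60604</zipCode></getTemperature>'
    '</soap:Body>'
    '</soap:Envelope>'
)

payload = "60604"  # the only actual data in the whole message
markup_bytes = len(soap_request) - len(payload)
ratio = markup_bytes / len(payload)
print(f"{markup_bytes} bytes of markup around {len(payload)} bytes of data")
```

Even this stripped-down envelope is well past the 10-to-1 mark; real toolkits add headers and typing attributes on top.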
One area of discussion that I didn't get a lot of in-depth details on is the WSDL document. This is an XML file that specifies the data types, messages, port types and bindings of the service. Simply put, the WSDL is the contract clients will use to talk to the web service. To me, this sounds similar to an IDL file used to create distributed components using COM, CORBA or RMI. I can't see why a client would use one of these at run time, so I suspect that once the design phase is completed the WSDL isn't as important. But that's my take on it.
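For a feel of what the contract looks like, here's a stripped-down, hypothetical WSDL fragment and a Python sketch that pulls out the operations a client could invoke. A real WSDL file also carries the types, messages and bindings mentioned above.

```python
# A stripped-down, hypothetical WSDL fragment, and a sketch of reading
# the operations it advertises.
import xml.etree.ElementTree as ET

wsdl = """<definitions xmlns="http://schemas.xmlsoap.org/wsdl/"
                       name="WeatherService">
  <portType name="WeatherPort">
    <operation name="getTemperature"/>
    <operation name="getForecast"/>
  </portType>
</definitions>"""

ns = {"wsdl": "http://schemas.xmlsoap.org/wsdl/"}
root = ET.fromstring(wsdl)
ops = [op.get("name") for op in root.findall(".//wsdl:operation", ns)]
print(ops)  # the operations a client may invoke
```

In practice, tools tend to read the WSDL once to generate client stubs, which fits the idea that it matters most at design time.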
Finally, the primary objective is to provide a large degree of automation for the user. If an application isn't going to automate information or resources for the end user, it doesn't make sense to develop it as a web service. The thing to keep in mind is that unlike a web application, a web service could make use of multiple servers to render information for the end user. This means more responsibility is shifted to the client. In addition, the key word for web services is modularity.
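A sketch of that modularity idea: one client composing an answer from several independent services. The two "services" below are stand-in functions with made-up names and data; in a real web service each would be an XML call to a separate server.

```python
# One client composing a result from several services. The "services"
# here are stand-in functions; in practice each would be a remote XML
# call to a different server.
def weather_service(zip_code: str) -> dict:
    return {"temp_f": 72}           # stand-in for a remote call

def traffic_service(zip_code: str) -> dict:
    return {"congestion": "light"}  # stand-in for a remote call

def commute_report(zip_code: str) -> dict:
    # The client, not any single server, assembles the final answer.
    report = {"zip": zip_code}
    report.update(weather_service(zip_code))
    report.update(traffic_service(zip_code))
    return report

print(commute_report("60604"))
```

Either service could be swapped for another provider without touching the rest, which is the modularity being promised.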
It's not in depth but it's a start in the right direction.
(From the Winter 2009 DePaul Magazine)
CDM Tech Security Programs Validated by National Agencies
The need to protect personal, financial and critical national security information has led to the growth of tech security degree programs like those offered by DePaul's College of Computing and Digital Media (CDM). (CDM was formerly CTI: the School of Computer Science, Telecommunications and Information Systems.) The efforts in information security offered by CDM have been recognized by the National Security Agency (NSA) and the Department of Homeland Security (DHS), which redesignated DePaul as a National Center of Academic Excellence in Information Assurance Education. Fewer than 95 academic institutions in the United States are classified as a Center of Academic Excellence by the two government entities. Originally designated in 2005, DePaul was successfully re-evaluated against more stringent criteria and will now be included in the program until 2013.
To aid its efforts, CDM also launched a Security Advisory Board featuring IT security professionals from a major medical center, a top accounting firm, a risk assessment and compliance firm and a leading consulting house. The board will work closely with the faculty and administrators to develop curricula, advise on current security and information assurance trends and tailor courses to help students meet the needs of the marketplace.
DePaul's degree programs in tech security were among the first to launch in the Midwest. CDM also offers a bachelor of science program in information assurance and security engineering that teaches students the fundamentals of information security and security engineering, security infrastructure design and implementation, and the impact of security requirements on business operations.
|10BaseT||An Ethernet standard that uses twisted wire pairs.|
|100BaseTX||IEEE physical layer specification for 100 Mbps over two pairs of Category 5 UTP or STP wire.|
|Adapter||Printed circuit board that plugs into a PC to add capabilities or connectivity. In a networked environment, a network interface card (NIC) is the typical adapter that allows the PC or server to connect to the intranet and/or Internet.|
|Backbone||The part of a network that connects most of the systems and networks together and handles the most data.|
|Bit||A binary digit. The value (0 or 1) used in the binary numbering system. Also, the smallest form of data.|
|Boot||To cause the computer to start executing instructions. Personal computers contain built-in instructions in a ROM chip that are automatically executed on startup. These instructions search for the operating system, load it and pass control to it.|
|Download||To receive a file transmitted over a network. In a communications session, download means receive, upload means transmit.|
|Ethernet||IEEE standard network protocol that specifies how data is placed on and retrieved from a common transmission medium. Has a transfer rate of 10 Mbps. Forms the underlying transport vehicle used by several upper-level protocols, including TCP/IP and XNS.|
|Fast Ethernet||A 100 Mbps technology based on the 10BaseT Ethernet CSMA/CD network access method.|
|Full Duplex||The ability of a device or line to transmit data simultaneously in both directions.|
|Half Duplex||Data transmission that can occur in two directions over a single line, but only in one direction at a time.|
|Hardware||The physical aspect of computers, telecommunications and other information technology devices. The term arose as a way to distinguish the "box" and the electronic circuitry and components of a computer from the program you put in it to make it do things. The program came to be known as the software.|
|Hub||The device that serves as the central location for attaching wires from workstations. Can be passive, where there is no amplification of the signals; or active, where the hubs are used like repeaters to provide an extension of the cable that connects to a workstation.|
|LAN (Local Area Network)||A group of computers and associated devices that share a common communications line and typically share the resources of a single processor or server within a small geographic area.|
|Motherboard||The typical physical arrangement in a computer that contains the computer's basic circuitry and components.|
|Network||A system that transmits any combination of voice, video and/or data between users.|
|NIC (Network Interface Card)||A board installed in a computer system, usually a PC, to provide network communication capabilities to and from that computer system. Also called an adapter.|
|Notebook Computer||A battery-powered personal computer, generally smaller than a briefcase, that can easily be transported and conveniently used in temporary spaces such as on airplanes, in libraries, temporary offices, and at meetings. A notebook computer, sometimes called a laptop computer, typically weighs less than five pounds and is three inches or less in thickness.|
|Port||A pathway into and out of the computer or a network device such as a switch or router. For example, the serial and parallel ports on a personal computer are external sockets for plugging in communication lines, modems and printers.|
|RJ-45 (Registered Jack-45)||A connector similar to a telephone connector that holds up to eight wires, used for connecting Ethernet devices.|
|Software||Instructions for the computer. A series of instructions that performs a particular task is called a "program". The two major categories of software are "system software" and "application software". System software is made up of control programs such as the operating system and database management systems (DBMS). Application software is any program that processes data for the user. A common misconception is that software is data. It is not. Software tells the hardware how to process the data.|
|Switch||1. A data switch connects computing devices to host computers, allowing a large number of devices to share a limited number of ports. 2. A device for making, breaking or changing the connections in an electrical circuit.|
Growing up in Chicago's inner city wasn't the most fun, as gangs kept most of the kids safely tucked in their homes once the sun went down. As a kid Mike spent his time daydreaming about being a TV star or working on cars, as most of the neighbors did in the alley. Computers were the last thing on his mind until sixth grade, when he became the owner of a Commodore 64 computer system, complete with printer and one really, really huge disk drive. Mike was able to create several non-mind-blowing games using the Commodore 64's sprite and ASCII art capabilities. By eighth grade Mike had developed a few 'spaceship blasting enemy' games, of which only the fourth one was cool.
Mike's high school didn't have a computer program, so out of boredom he drew wolf characters, which became the root of his nickname. Between sketches he studied physics and calculus with the thought of going into Aerospace Engineering for NASA. During Mike's junior year the family's house caught fire after an arson attempt by a local street gang. The house was the sixth on the block to be set on fire in a three-year period, and the loss provided an opportunity to start a new life on Chicago's northwest side.
"Wolffy", a name he earned in high school based on his artistic abilities, went on to DePaul University, where he majored in Physics and was well on his way to completing his three years of classroom training before going to do field work. Two weeks into his freshman year Mike discovered the world of pinball after being tormented by a game called "Funhouse". Many boring physics and chemistry classes were secretly replaced with pinball study hall at the student union.
The coursework became more interesting when a physics and calculus course featured planetary objects and putting satellites in orbit. But then came a course in E&M, and Mike realized that this level of work wasn't holding his attention. Noticing that he enjoyed generating the computerized lab reports and collecting measurements, Mike switched his major to Computer Science. From his two years in the physics program Mike took with him minors in both Physics and Math.
Under Professor Henry Harr, Mike quickly learned and adapted to the environment of the computer field, at least academically. Following graduation, he was hired into the up-and-coming client-server group at Deloitte & Touche's DRT Systems. To start his employment he worked on a six-month contract for a downtown Chicago bank. That job was incredibly boring, and Mike quickly moved on.
He ended up at a Deerfield-based computer security company, where he met his field mentor, Dave Sowinski. Dave showed Mike how to handle the managerial politics involved in developing good software. Together the two created a data cryptography product that was (at the time) the most advanced to date. In 2000, Mike returned to DePaul University to get his Master of Science in Computer Science.
Working under Dr. Clark Elliott, Mike studied distributed technology, covering topics including Business Object Architecture, server-side web applications, distributed architectures, server development and distributed databases, all technologies Mike hoped could be used to build a security services company that would rival current application development companies.
Mike graduated with his master's degree in June of 2002. Since then he has focused on server-side security and distributed data, taking breaks to work with local pinball enthusiasts and to satisfy his passion for cars. When he isn't doing any of those, Mike can be found watching hockey, riding his bike, attempting tennis, enjoying an American road trip, or at times relaxing in his second home of Orlando, FL.