SSL/HTTPS is widely used and is generally considered a secure method of encrypting sensitive information for transmission across the Internet. But just how secure is it?
Security is a crucial concern in the present era of communication. As networking, and the Internet in particular, continues to grow, the need for authentication and encryption is rising rapidly. Many businesses and government establishments are no longer willing to transmit classified information and messages in unencoded form over an unsafe network. (IPV6 vs. SSL: Comparing Apples with Oranges) Encryption is therefore used routinely by present-day online and offline operating systems. At any moment there is data, such as passwords, that must be stored and transferred between computers, and encryption is used to make it unintelligible to unintended recipients. To illustrate, operating systems such as current versions of Windows generally do not store passwords as the plain text that was entered, but as a numerical hash of the original, computed by one of many available techniques. This makes it difficult to recover the password simply by browsing the files or the registry. (Beginners Guides: Encryption and Online Privacy)
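The password-storage scheme just described can be sketched in a few lines of Python. This is only an illustration of the idea of storing a salted hash instead of the plain text; the actual algorithm Windows uses is different, and the function names here are invented for the example.

```python
import hashlib
import os

def hash_password(password, salt=None):
    """Store a salted hash of the password instead of the plain text."""
    if salt is None:
        salt = os.urandom(16)  # a random salt defeats precomputed-hash lookups
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, stored):
    """Recompute the hash and compare; the plain text is never stored."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000) == stored

salt, digest = hash_password("s3cret")
assert verify_password("s3cret", salt, digest)       # correct password matches
assert not verify_password("wrong", salt, digest)    # wrong password does not
```

An attacker who browses the files and finds `digest` cannot read the password from it; recovering it would require guessing candidates and hashing each one.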
Encryption has gained significance with the increasing use of the Internet as a medium for transferring information. A person who relies on Internet resources to move data across a public network needs assurance that the data is sufficiently encrypted to be unintelligible to a casual eavesdropper. The common principle behind computer data-encoding techniques is to use a numerical value, the key, to transform the data sent over the wire into a meaningless collection of characters. Depending on the encryption method deployed, the key is agreed with the remote computer before the data is actually transferred; both computers then use it to encode and decode the data, so that only the holder of the matching key value can decipher it. Many encryption techniques are available, and the one most commonly used in Internet transactions is the Secure Sockets Layer. (Beginners Guides: Encryption and Online Privacy)
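The shared-key principle can be shown with a deliberately toy cipher. A simple XOR against a repeating key is not remotely secure, but it demonstrates the defining property of the schemes described above: the very same key turns the plaintext into a meaningless stream of bytes and back again. The key and message are invented for the example.

```python
import itertools

def xor_stream(data, key):
    """Toy symmetric cipher: XOR each byte against a repeating key.
    One call encrypts; the identical call with the same key decrypts.
    Real SSL ciphers are far stronger; this only illustrates the idea."""
    return bytes(b ^ k for b, k in zip(data, itertools.cycle(key)))

key = b"agreed-before-transfer"
plaintext = b"card number 4111..."
ciphertext = xor_stream(plaintext, key)
assert ciphertext != plaintext               # on the wire: a meaningless collection of bytes
assert xor_stream(ciphertext, key) == plaintext  # the matching key recovers the data
```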
One of the greatest dangers to computer security is posed by users who indiscriminately download software from different web sites without consideration or caution. The software might not securely authenticate or interpret data. It might contain malware such as Trojan horses, which can take the form of malicious web site addresses or destructive code executed through interpreters such as PostScript on the client workstation or the Java Virtual Machine. If the software's contents are not properly examined, this data may corrupt the programs resident on the client system. Alteration of data residing on the server is one of the dangers to a server: users who gain access without permission can alter, corrupt, or remove data stored there. Disorganized authentication and access-control systems, or the complete lack of these safety measures, might allow illegitimate users to reach the server's information without any difficulty. (Secure Servers with SSL in the World Wide Web)
Besides this, faults in the server software itself might expose the system by opening a security hole for illegitimate users. The effectiveness of point-to-point security depends on the ability of the client and the server to establish and safeguard a secured communication channel. Currently, companies and clients conduct business over the Internet through the World Wide Web, and customers part with personal information such as credit card numbers and account details over unsafe communication networks. The dangers of conducting business over the Internet can be summarized as disclosure of information, manipulation of information, destruction of information, and unavailability of services. The hazards associated with these dangers are financial loss and threats to life and reputation. An answer to these concerns is to provide a means of ensuring confidentiality, authentication, integrity, and non-repudiation. Presently, a majority of organizations are protected by an Internet firewall that enforces security rules based on the network services permitted within the establishment. (Secure Servers with SSL in the World Wide Web)
URLs are compatible with various types of resources such as FTP, HTTP, NNTP, TELNET, RLOGIN, and so forth. Nearly all firewalls will permit only a subset of these services to run. Nevertheless, web servers can extend these services outside the normal channels, such as the usual FTP routes, so it is possible to circumvent the firewall's security rules by using the services available through the web. This paper reviews a scheme proposed by Netscape known as SSL, the Secure Sockets Layer, designed to provide confidential and authenticated communications. This technology addresses, and offers a solution for, the majority of the problems that make running a website an unsafe proposition for many organizations. (Secure Servers with SSL in the World Wide Web)
In view of the increasing volume and significance of the data transferred over the Internet, the need to secure it is increasingly felt. Every user of a public network now transfers various types of data, ranging from email to credit card details, every day, and wants that data safeguarded in transit over the public network. (Secure Socket Layer, www.windowsecurity.com) Internet browsing involves constant communication between the Web server and the Web browser. Generally, the data transferred between the server and the browser is perfectly comprehensible to anybody keen on deciphering it. The circumstances are quite similar to a conversation over the telephone: anybody with access to the electrical connection between the two telephones at any point can eavesdrop, and even inject inauthentic information into the conversation.
Eavesdropping on Internet communication is not technically trivial, but neither is it rocket science. On the whole, however, people place less trust in Internet security than the technology deserves. More than half of people feel that sending a credit card number over the Internet offers about the same level of security as shouting it from a rooftop. That impression is mistaken: it is roughly as secure as making a telephone call through an organizational switchboard. Only a small number of people are in a position to listen in, and doing so requires both a strong enough motive and the necessary technical skill; most people have neither the expertise nor the access. Yet to provide online trading and e-commerce services, customers must send their credit card numbers over the Web.
It is a matter of business ethics to take sufficient precautions to secure customers' data and confidential information, and customers' trust grows with their confidence that the business takes the protection of such data seriously. The traditional problem in encryption is transmitting the key from one computer to another: for encryption to work, the keys used by sender and receiver must be compatible, and sending a key over the Internet is not considered safe. The solution to this limitation is asymmetric encryption, which uses different but mathematically related keys for encryption and decryption. At the start of a communication, the server generates two compatible keys. One, known as the public key, can only encrypt; it cannot decrypt. It is therefore safe to send it over the Internet to the browser: if it is captured by someone, it is of little use to him, since it can only encrypt. (Secure Socket Layer, www.windowsecurity.com)
The browser uses the public key to encrypt the data that must be sent to the server, and the encrypted data is then sent securely over the Internet. If the data is intercepted, the interceptor can make no sense of it until it is decoded with the help of the private key, which is retained by the server. The decoded data is then processed as usual, and the reverse procedure is followed for data transfer from the server to the client. The public key/private key arrangement is fundamental to all modern secure communication methods and is considered very strong. It is not, however, foolproof: a person with adequate technical expertise and hardware-level access to the network could break the security. Even so, SSL with the right configuration is considered perfectly sufficient for all commercial purposes. To safeguard data in transit, it is customary to adopt the SSL protocol in practice across all network services that use TCP/IP to support typical communication tasks between servers and clients. (Secure Socket Layer, www.windowsecurity.com)
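The public/private key relationship can be demonstrated with textbook RSA arithmetic on deliberately tiny numbers. Real SSL keys are hundreds of digits long, so this sketch offers no security; it only shows why a captured public key is of little use to an eavesdropper.

```python
# Textbook RSA with tiny primes, illustrating the asymmetric scheme above.
p, q = 61, 53
n = p * q                      # modulus, part of both keys
e = 17                         # public exponent: anyone may encrypt
d = 2753                       # private exponent: only the server decrypts
assert (e * d) % ((p - 1) * (q - 1)) == 1  # the keys are mathematically paired

message = 65                   # data encoded as a number smaller than n
ciphertext = pow(message, e, n)          # browser encrypts with the public key
assert ciphertext != message             # intercepted data is unintelligible
assert pow(ciphertext, d, n) == message  # server recovers it with the private key
```

Knowing `e` and `n` (the public key) does not let the interceptor reverse the operation; only `d`, which never leaves the server, decrypts.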
Communication over the Internet passes through multiple program layers on a server before actually reaching the requested data, such as a web page or CGI script. Requests first hit the outer layers, which include high-level protocols such as HTTP (the web server), IMAP (the mail server), and FTP (file transfer). Which outer-layer protocol manages a request depends on the type of request the client makes. These high-level protocols then process the requests through the Secure Sockets Layer. (How does SSL work? Detailed SSL – Step 1 Determine Secure Communication) A socket is the logical link between the client and the server, and SSL's encryption takes place at this very low level of communication. This means separate methods are not needed for encrypting text, images, sounds, Java applets, and so on: all communication between the client and the server is encrypted in the same way. (Secure Servers) The Secure Sockets Layer is thus a method for encrypting data in transit over the Internet. Its real significance lies in data transfer in an e-commerce environment, where credit card details and other sensitive data must increasingly be transferred. SSL can also create a Virtual Private Network as a substitute for the traditional technologies of IPSec and PPTP. (SSL Acceleration and Offloading: What Are the Security Implications?)
The main operations of SSL can be explained as follows. Server authentication: server authentication permits a user to verify the identity of the server involved in any sensitive business dealing. This is achieved by employing a public-key method to confirm the authenticity of the server's certificate, which has been approved by a trusted certificate authority. When sending confidential information such as a credit card number, this facility confirms the identity of the server. Client authentication: client authentication permits a server to verify the identity of a user in the same way. Client verification might be used by banks and Internet-based brokers to make sure a transaction is being made with the rightful user before executing sensitive operations such as share purchases or transfers of money. Encrypted connection: SSL manages the encryption and decryption of information sent between the client and the server. Information sent through an encrypted SSL connection stays private and free from intrusion, guaranteeing that the data received is untouched and was not viewed by others. (The Secure Sockets Layer Protocol – Enabling Secure Web Transactions)
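Server authentication against a trusted certificate authority is visible in, for example, Python's standard ssl module, whose default client-side context behaves much as a browser does. A minimal sketch; the defaults shown are those of recent Python versions.

```python
import ssl

# A client-side SSL context as a browser or Python client would build it:
# server certificates are checked against a store of trusted certificate
# authorities, and the certificate must match the requested hostname.
context = ssl.create_default_context()
assert context.verify_mode == ssl.CERT_REQUIRED  # server must present a valid certificate
assert context.check_hostname                    # certificate must name the server we asked for
```

A connection made through this context fails with a verification error unless the server proves its identity, which is exactly the server-authentication operation described above.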
The SSL protocol was first introduced by Netscape to provide data security in transit over application layers such as HTTP, LDAP, or POP3. (Secure Socket Layer, www.windowsecurity.com) The initial version of the protocol was released in its crudest form during the summer of 1994, for use in the Mosaic browser. Its second version, SSL 2.0, was integrated with the original Netscape Navigator Web browser and was released towards the end of 1994. Within the first year of Netscape Navigator's introduction, Microsoft released its own Web browser, Internet Explorer, at the end of 1995, and a few months later brought out its Private Communication Technology (PCT) specification, intended to overcome the weaknesses of SSL 2.0. Netscape released SSL 3.0 during the winter of 1995. (The Secure Sockets Layer Protocol – Enabling Secure Web Transactions)
Various writers have examined the SSL protocol suite and agree that, starting with version 3.0, it is stable and free of significant design defects. Wagner and Schneier concluded from their analysis that, overall, SSL 3.0 gives exceptional protection against eavesdropping and other passive attacks; although export-weakened modes offer only minimal safeguarding of private information, there is little SSL itself can do about that. (Heinrich, Secure Socket Layer (SSL)) The Internet Engineering Task Force (IETF) set out to make SSL an international standard in May 1996, as it had done with the TCP and IP protocol standards. At the beginning of 1999 the IETF renamed SSL the Transport Layer Security (TLS) protocol; TLS 1.0 is essentially an extension of SSL 3.0. Presently, all major Web browsers and Web servers support SSL, and it is used universally in Web transactions from book ordering to electronic fund transfers. The implementation of SSL in Web browsers is largely invisible to users, apart from the https prefix in the Web address and an icon signifying a secured connection. (The Secure Sockets Layer Protocol – Enabling Secure Web Transactions)
SSL 2.0 became the de facto standard for cryptographic protection of Web HTTP traffic. However, it had shortcomings in both cryptographic security and functionality, which led to the upgrade to SSL 3.0 incorporating several improvements; this new version soon achieved widespread deployment, and the IETF's Transport Layer Security effort also used SSL 3.0 as the base for its standards work. SSL 3.0 thus endeavors to give Internet client/server applications a practical, widely applicable, connection-oriented communications security mechanism. SSL 2.0 had many security weaknesses that SSL 3.0 attempts to overcome. SSL 2.0 weakened its authentication keys to 40 bits in export-weakened modes. (How does SSL work? Simplified SSL – About Secure Sockets Layer and HTTPS) SSL 2.0 also uses a weak MAC construction, although post-encryption does help resist attacks. Notably, SSL 2.0 leaves the padding length unauthenticated while feeding the padding bytes into the MAC in block-cipher modes, which allows an active attacker to delete bytes from the end of messages. In the cipher-suite rollback attack, the attacker edits the list of cipher-suite preferences in the hello messages to induce both endpoints to use weaker encryption than they would otherwise have chosen. This flaw limits SSL 2.0 to least-common-denominator security and leaves it vulnerable to active attacks. Some of these weaknesses were also found by others: Dan Simon specifically emphasized the cipher-suite rollback attack, and these concerns were likewise raised by Paul Kocher. The PCT 1.0 protocol was examined and found to counter some of these weaknesses, but not all. (How does SSL work? Simplified SSL – About Secure Sockets Layer and HTTPS)
The goals of SSL are, first, to authenticate the client and server to each other: SSL supports the use of standard public-key cryptographic techniques to authenticate the communicating parties to one another, and in common use the service authenticates clients on the basis of a certificate. The second objective of SSL is to ensure data integrity, so that data cannot be tampered with, intentionally or unintentionally, during a session. The third objective is data privacy: data in transit between the client and the server must be safeguarded from unauthorized capture and be decipherable only by the intended recipients. This requirement applies both to the data associated with the protocol itself, securing traffic during negotiation, and to the application data sent during the session. SSL is not a single protocol. (The Secure Sockets Layer Protocol – Enabling Secure Web Transactions)
In reality it is a set of protocols that can be divided into two layers: one that ensures data security and integrity, consisting of the SSL Record Protocol, and one comprising the protocols designed to establish an SSL connection, namely the SSL Handshake Protocol, the SSL Change Cipher Spec Protocol, and the SSL Alert Protocol. (Secure Socket Layer, www.windowsecurity.com) To understand the most widely accepted protocol for secure transmission of data over the Web, it helps to know the relationship between SSL and the other Web protocols at a high level. The Internet architecture consists of layers of protocols that depend on a foundation of protocols beneath them. By analogy with feet and legs: the feet provide balance and traction, while the legs provide the strength to walk and jump. The leg protocol depends upon and extends the function of the foot protocol; the two are intricately tied together, yet the foot protocol does not depend on the leg protocol to discharge its primary responsibilities. The foot protocols are the basis on which other, more functional protocols rely. (The Secure Sockets Layer Protocol – Enabling Secure Web Transactions)
The primary Internet protocols are the Internet Protocol (IP), which routes messages through the networks; the Transmission Control Protocol (TCP), which ensures reliable transmission of IP messages; and the Hyper Text Transfer Protocol (HTTP), which handles the interaction between Web browsers and Web servers. The SSL protocol is an optional layer positioned between the TCP and HTTP protocol layers. The aim of SSL is to provide full security for multiple applications as an easily deployable, dedicated security protocol. Positioning SSL between the TCP and HTTP layers requires no substantial changes to TCP or HTTP: the HTTP layer interfaces with SSL much as it does with TCP, and the TCP layer serves SSL as it does any other requesting application. As an independent security protocol, SSL can serve other protocols, such as the File Transfer Protocol, equally well. (The Secure Sockets Layer Protocol – Enabling Secure Web Transactions)
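The layering described above, with SSL slotted between TCP and the application protocol, can be seen in code: a plain TCP socket is wrapped so that the application above it is unchanged while everything below it is encrypted. A minimal sketch using Python's standard ssl module; the hostname is a placeholder and no connection is actually made, since the handshake would only run when connect() is called.

```python
import socket
import ssl

context = ssl.create_default_context()
tcp_sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)  # the TCP layer
tls_sock = context.wrap_socket(tcp_sock, server_hostname="example.com")  # SSL layered on top
assert isinstance(tls_sock, ssl.SSLSocket)  # same socket API, so HTTP above it is unchanged
tls_sock.close()
```

The application would send its HTTP request through `tls_sock` exactly as it would through a plain socket, which is why neither TCP nor HTTP needs modification.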
The recommended design for deploying SSL is the non-transparent one, since it offers the maximum flexibility and scalability. Users can be traced through cookies and, optionally, by having the SSL accelerator log client addresses to a syslog server. This cache of client addresses makes it possible not only to trace sessions but also to resume sessions that were closed successfully, avoiding processing overhead and freeing bandwidth for even more SSL transactions. SSL accelerators can be used together with applications and content switches. The design is called non-transparent because every packet decrypted by the SSL accelerator carries the source address of that accelerator. (Introduction to Secure Sockets Layer)
From the server's point of view, the request arrived from the SSL accelerator. Many customers had difficulties with this arrangement at first, since they generally expect to see the client's source address at the server. Many SSL termination devices retain the capability of transmitting the client's source address to a syslog server for tracing, but with cookies enabled you can collect far more detail about the client: apart from the IP address, cookies can trace the client's movements on the Internet, including the websites visited, as well as personal information. (Introduction to Secure Sockets Layer) A benefit of SSL is that it functions independently of the application protocol: a higher-level protocol can sit on top of the SSL protocol transparently. (Freier; Karlton; Kocher, The SSL Protocol) Another important advantage of SSL is that it does away with the need for client software to be installed on the end user's machine. SSL avoids many of the problems associated with IPSec, such as foreign firewalls filtering or blocking IPSec traffic; this is a vital issue for traveling employees who are frequently at a customer's or supplier's site, inside that organization's firewall, as is the inability to run several simultaneous IPSec sessions on one PC. (News & Analysis: Past, Present, and Future) When SSL receives a request from a client for a non-secure connection, the request passes straight through the TCP/IP layers to the server application or data. When SSL receives a request for a secured connection, it begins a greeting to initiate the secure communication process. (SSL Acceleration and Offloading: What Are the Security Implications?)
In this case SSL requires a secure connection before allowing communication to pass through the TCP/IP layer, which means a non-secure request is returned with an error message asking that it be retried securely. SSL uses a "handshake" protocol to negotiate and establish a session between the client and server computers. During these sessions identity is authenticated through digital certificates, and data integrity is ensured by the communicating computers through a hash algorithm such as MD5 or SHA-1. A message from the client computer known as the Client Hello initiates the SSL session, and the server responds with a Server Hello. These messages settle the parameters of the communication: the version of SSL, a session ID for resuming a previous session, the "cipher suite", and the compression algorithm. The client authenticates the identity of the server, which sends a digital certificate containing its public key. In some cases, two-way authentication is essential. (SSL Acceleration and Offloading: What Are the Security Implications?)
Sometimes the server must verify the identity of the client in addition to the client verifying the identity of the server; Internet banking is a good illustration. In such cases the server sends a client certificate request, and the client responds with its own certificate for two-way authentication. The client also sends a key exchange message containing the pre-master secret, encrypted with the server's public key; the server decrypts this message with the private key belonging to the same key pair. The client then sends a hash of the foregoing messages encrypted with its private key, and the server can verify the client's identity by decrypting it with the client's public key. Next, the client sends a message informing the server that subsequent messages will be encrypted with the agreed algorithms, followed by an encrypted and hashed "Client Finished" message. The server responds with a message telling the client that its subsequent messages will be encrypted, then sends its own encrypted "Server Finished" message. When the client can decrypt it, the negotiation has succeeded, and client and server communications are thereafter encrypted using the keys and algorithms settled during the handshake. (SSL Acceleration and Offloading: What Are the Security Implications?)
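The integrity check that runs after the handshake can be sketched with a keyed hash. One caveat: SSL 3.0 uses its own MAC construction rather than HMAC, which is the variant later adopted by TLS, so this illustrates the idea rather than the exact SSL 3.0 algorithm. The key and record contents are invented for the example.

```python
import hashlib
import hmac

# Both ends derive a shared MAC key from the handshake secrets.
mac_key = b"derived-during-handshake"
record = b"GET /account HTTP/1.0"

# Sender appends a tag computed over the record with the shared key.
tag = hmac.new(mac_key, record, hashlib.sha1).digest()

# Receiver recomputes the tag; an unmodified record verifies...
assert hmac.compare_digest(tag, hmac.new(mac_key, record, hashlib.sha1).digest())
# ...while any tampering in transit produces a mismatch.
tampered = b"GET /admin   HTTP/1.0"
assert not hmac.compare_digest(tag, hmac.new(mac_key, tampered, hashlib.sha1).digest())
```

An attacker without the MAC key cannot forge a valid tag for altered data, which is how the two computers detect tampering.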
Secure processing of transactions on the Internet implies the ability to transmit information between the Web site and the customer in a way that makes it difficult for others to capture and decipher. This security is provided by the Secure Sockets Layer (SSL), which works together with the programs and encryption/decryption routines that exist on the web server and in browser programs, such as Netscape and Internet Explorer, used by the customer. Even though an SSL certificate is issued for a fully qualified domain name, Web servers link the certificate to the corresponding IP address. This produces unexpected results when one tries to attach more than one SSL certificate to a single IP address: irrespective of the domain one attempts to access, the certificate used for the IP address is the first one in the Web server's configuration file. This matters to the Web site owner, since most budget or free Web hosting services are reluctant to provide each site with its own IP address. (How does SSL work? Simplified SSL – About Secure Sockets Layer and HTTPS)
Obtaining a new IP address for an SSL certificate is a significant factor in the extra price of secure hosting on budget Web hosts, and it often raises the price of a full-service host as well. With full-service Web hosts, separate certificates for multiple domains require opening individual accounts for each domain so as to obtain separate IP addresses. Because certificates are not permanently bound to IP addresses, it is possible to move a certificate from one Web host to another. A businessman genuinely concerned about security should ensure that a valid, current SSL certificate is used and should specify the URL to use when creating secure links. SSL certificates can be obtained from various certificate vendors, and a Certificate Signing Request (CSR) must be generated; this is done in association with the hosting company, by asking them to generate the CSR. Once the CSR has been obtained, the certificate can be ordered from the SSL certificate provider. (How does SSL work? Simplified SSL – About Secure Sockets Layer and HTTPS)
On receipt of the SSL certificate back from the certificate authority, the hosting company will usually install it for you. It is also necessary to ensure that the hosting account allows an SSL certificate, and the provision of a unique IP address is a primary requirement; if this is not documented on the Web host's site, the host can be contacted directly. It is essential to understand how the account handles its own certificate and what additional costs are involved. Once the web host installs the new certificate on the Web server, the merchant or designer should be sure to call the secure pages using "https://" in their links. All elements on the page should use either a relative path, or an address beginning with "https://" rather than "http://", so as to avoid browser warnings about insecure items. Page items referenced by a relative path default to the same protocol under which the page is displayed. (How does SSL work? Simplified SSL – About Secure Sockets Layer and HTTPS)
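The mixed-content rule above can be illustrated with a toy checker that flags page elements loaded over plain "http://" on a secure page. The HTML, the hostname, and the function name are invented for the example; real browsers perform this check themselves.

```python
import re

def insecure_references(html):
    """Toy mixed-content check: on an https page, any element fetched
    via plain http:// triggers browser warnings about insecure items.
    Relative paths inherit the page's protocol and are therefore safe."""
    return re.findall(r'(?:src|href)="(http://[^"]*)"', html)

page = '''<img src="http://example.com/logo.gif">
<img src="/images/cart.gif">
<a href="https://example.com/checkout">Checkout</a>'''

# Only the absolute http:// image would provoke an "insecure items" warning.
assert insecure_references(page) == ["http://example.com/logo.gif"]
```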
The dominant use of SSL is in HTTP servers and client applications. Nowadays nearly all HTTP servers support SSL sessions, and an SSL-enabled client comes with the IE or Netscape Navigator browsers. (Secure Socket Layer, www.windowsecurity.com) To provide data privacy, SSL employs a single shared key for both encryption and decryption, called symmetric encryption. Compared with asymmetric encryption, which employs a matched key pair, symmetric encryption is considered relatively weaker; but because symmetric encryption is much faster and needs far less processing, that drawback is compensated to some extent. Longer keys strengthen SSL encryption: SSL can use DES, 3DES, RC2, and RC4, with key lengths up to 168 bits. (SSL Acceleration and Offloading: What Are the Security Implications?) Though SSL is employed in several services, the SSL protocol is primarily associated with WWW pages, because SSL guards the HTTP communication channel over the Internet; the protocol can, however, be used to protect the transmission of any TCP/IP service. Next to WWW access, SSL's most likely application is in sending and receiving email. In Windows NT/2000/XP applications, SSL is chiefly used in the HTTP and SMTP server services that operate in combination with IIS.
Using these servers, a suitable certificate request can be produced either by obtaining the certificate from one of the trusted CAs or by issuing the user's own certificate through the Windows 2000 Server certificate services. It is feasible to acquire and install the certificate on the SMTP server supplied with IIS, or the one offered in Exchange 2000, by following the same procedure adopted for the WWW server. Since SSL is not built into any one application, it can sustain other application protocols. (The Secure Sockets Layer Protocol – Enabling Secure Web Transactions) A probable use of SSL in assisting other services depends on whether such a connection can be configured on the server. (Secure Socket Layer, www.windowsecurity.com) When the protocols of the other applications change, SSL does not need to be updated. This lets the other protocols concentrate on their primary goals and rely on the SSL layer for security; moreover, the internal workings of the application protocols do not change when security is added. It simply requires the decision to use SSL. Such flexibility gives SSL increasing scope for adoption in the future. SSL has become the universal standard for securing transactions and information over the Internet, which adds to its advantage in the market; it is a reliable, proven protocol supported by all major browsers and Web servers. (The Secure Sockets Layer Protocol – Enabling Secure Web Transactions)
Every time an e-commerce transaction is carried out over the Internet, secret private information such as credit card numbers or social security numbers is exchanged between the buyer, the retail outlet, and the other functionaries participating in the transaction, such as a payment processor. This information is vulnerable to interception by illegitimate outsiders and to malicious assaults by hackers. SSL protects against these threats. At the same time it confirms the ownership of a website and guarantees that the collected information is given to the correct storefront owner and the rightful parties involved in the purchase. SSL is thus a vital necessity for dependable, competitive e-commerce. Many knowledgeable buyers will not purchase from web sites that lack the minimum standard of security they expect. Indeed, sellers with SSL-enabled websites stand to increase their business simply by assuring customers that their personal information and consumer data are secure. (Secure Socket Layer (SSL), dynamic.com/downloads/marketing)
When a Web browser is asked to transmit protected information to your server, it will generally demand that your organization confirm it is the intended recipient of that information. This helps prevent rogue organizations from collecting, say, credit card numbers for fraudulent purposes. A site certificate serves this purpose: it is a declaration of an organization's name, location, and Internet domain. On its own, however, a browser cannot trust a site certificate, since anyone can create one. What the browser expects is a signature from an organization it already trusts. Such a signature takes the form of additional numeric data that is difficult or infeasible to forge and that can be reliably traced back to the signing agency. Most Web browsers are configured to trust well-known certification authorities such as Thawte and VeriSign.
In addition, the Web browser will accept certificates signed by organizations whose own certificates have been signed by a trusted agency. In other words, there is a chain of trust running from one's site certificate up to an agency the browser trusts. To reduce the opportunity for fraud, some browsers limit the length of this chain. Even if your site certificate cannot be verified this way, the browser will not automatically refuse the connection; instead, it will ask the user how to proceed, typically displaying the information in the certificate and asking whether to trust it. If the user has never heard of your organization, he will click "No," and you may lose a prospective customer. To avoid such situations, it is better to pay a moderate fee to have your site certificate signed by a trusted agency. Certificates also work both ways: the server operator can require the browser to produce its own credentials.
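The chain-of-trust checks a browser performs can be illustrated with Python's standard `ssl` module. The sketch below shows how a client-side context is configured with the two checks described above; the commented-out connection code is an illustrative usage pattern, not part of any particular browser's implementation.

```python
import ssl

# Build a client-side context that mirrors the browser behaviour described
# above: only certificates that chain up to a trusted CA are accepted,
# and the certificate's name must match the host that was requested.
context = ssl.create_default_context()

# create_default_context() loads the platform's bundle of trusted CAs
# (the "well-known certification authorities" mentioned in the text) and
# enables both checks by default:
assert context.verify_mode == ssl.CERT_REQUIRED  # reject unverifiable chains
assert context.check_hostname is True            # reject name mismatches

# A context configured this way would then be used to wrap a socket, e.g.:
#   with socket.create_connection((host, 443)) as sock:
#       with context.wrap_socket(sock, server_hostname=host) as tls:
#           print(tls.getpeercert()["subject"])
```

If either check fails, the handshake raises an error instead of silently proceeding, which is the programmatic equivalent of the browser's warning dialog.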
This approach can be used if you run a Web server that distributes sensitive commercial information among different parts of your organization. Because the server cannot stop and ask a user whether to accept a doubtful certificate, the server administrator must supply fairly thorough and trustworthy information about which signing agencies to trust. In a commercial setting, one acts as the signing agency for one's own server; the server is then configured to trust only those browsers whose certificates one has signed oneself. For small-to-medium sized organizations, this system is more than adequate. Note that although this process is harder to set up, it is far better than shielding specific areas of one's server with a password. A password system cannot force the data to be encrypted, so it does not reduce the danger of someone poking around one's private server; it can only stop casual intruders from eavesdropping. (Boone, Secure Servers)
Installing a site certificate is usually a lengthy procedure: creating a certificate signing request, sending it to a signing agency, and then installing the signed certificate. Complete guidelines should come with one's Web server, though their authors often forget that readers do not hold a Master's degree in cryptography. Keep in mind that a secure Web server encrypts only the communication between the Web server and a browser. It does not encrypt other communication software or the data stored on its own hard disk, nor does it apply any other method to protect the computer itself. There are many ways in which a server's security can be threatened that SSL cannot help with, and SSL on one's Web server does nothing for other communications software. For instance, suppose one uses online shopping software that records customers' credit card numbers on the server along with their orders. How is that information to be transferred from the Web server to one's desktop PC? (Boone, Secure Servers)
Using Telnet or FTP to connect to the server can be troublesome: since these methods do not use encryption by default, the information will be exposed to the Internet, negating any advantage of using SSL. One can either obtain secure versions of FTP and Telnet that use SSL, or have the ordering software e-mail orders in an encrypted format such as PGP. Do not assume that merely running a secure server resolves all of one's security problems at once. A secure server does not restrict access to any part of the Web server's content; it only guarantees that access takes place over an encrypted channel. To restrict access, other methods must be used alongside SSL, such as username and password controls or client certificates. (Boone, Secure Servers)
Keep in mind that SSL does not specify the type of encryption to be used; at present, SSL understands nine different encryption methods of varying strength. The strongest is IDEA, the International Data Encryption Algorithm, and it is what one should select if asked to choose. However, one's server cannot control which encryption techniques a browser supports. If a browser connects and asks the server to use a weak encryption technique, the server will fall back to the weak algorithm by default. If the browser is sending the organization's critical data, the server should be configured to refuse such requests. Normally this is not an issue, as most browsers support reasonably strong encryption. (Boone, Secure Servers)
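Refusing weak requests is exactly what a modern server-side TLS configuration does. The sketch below, using Python's standard `ssl` module, pins a floor on the protocol version and restricts the cipher list so that a client asking for a weak algorithm is rejected rather than accommodated; the particular cipher string is an illustrative choice, not a recommendation.

```python
import ssl

# Server-side sketch: refuse weak encryption instead of falling back to it.
context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)

# Set a minimum protocol version; older, weaker handshakes are rejected.
context.minimum_version = ssl.TLSVersion.TLSv1_2

# Restrict the cipher list to strong suites and explicitly exclude
# export-grade and broken algorithms (illustrative OpenSSL cipher string).
context.set_ciphers("ECDHE+AESGCM:!aNULL:!eNULL:!EXPORT:!DES:!RC4:!MD5")

assert context.minimum_version == ssl.TLSVersion.TLSv1_2
```

With this configuration, a handshake from a client that only offers excluded suites simply fails, which is the behaviour the text argues a server handling critical data should adopt.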
While SSL's validation and approval methods are adequate for e-commerce sites, they are awkward for standalone devices. Certificate-based verification is troublesome for the following reasons: operating a Certificate Authority (CA), or using a service provider such as VeriSign, is costly and tricky; every machine must be configured with the CA's public key; it requires computationally expensive public-key cryptographic operations; and when A authenticates to B, B must verify that the certificate it obtained from A is really the one it was expecting. Even if the certificate is signed by A's CA, someone else could compromise the CA and have a certificate issued, obtaining a private key and deceiving B into believing it is talking to a genuine peer. This is closely related to the revocation problem: it is very difficult to withdraw authorization from a device once it has been stolen or compromised.
To withdraw authorization, every endpoint must check the certificate it is offered against a certificate revocation list to see whether it has been cancelled. This is comparable to the unattractive requirement of configuring every device with the passwords of all of its prospective peers. Password-based authentication is troublesome for precisely that reason: every device must be configured with the password of each of its prospective peers, and if each device has to communicate with numerous applications or devices, the number of passwords grows rapidly. In any case, with either certificate-based or password-based authentication in SSL, an enterprise cannot centrally decide which device is allowed to talk to which application, that is, authorization; every endpoint must be configured individually with the appropriate authorization data. For a typical enterprise deployment with hundreds or thousands of devices and perhaps a dozen back-end applications, configuring all passwords and authorization rules into all devices is an enormously demanding, if not infeasible, administrative job. (Why CDAP and Not SSL/HTTPS)
Both SSL and IPSec implementations offer strong message-authentication algorithms, but more IPSec implementations offer strong bulk encryption. SSL encrypts above the transport layer (TCP), so it is your application data that is safeguarded. When IPSec is deployed with the IP Encapsulating Security Payload protocol, as is common, the TCP header and much of the IP header are encrypted and protected as well. IPSec thus offers protection against traffic analysis that SSL cannot provide. SSL is also susceptible to certain malicious data-insertion attacks and some Distributed Denial of Service attacks. The conclusion is that every institution must weigh the risks and likelihood of these attacks before deciding between IPSec and SSL. (News & Analysis: Past, Presentation, and Future)
Key length is acknowledged as one of the most widespread problems with SSL. Shorter keys can be broken: they can simply be guessed by trial and error. With a few modern computers it takes less than a day to work out a 40-bit key through sheer guesswork, and yet 40-bit keys are what the international versions of Netscape and Internet Explorer currently use. One can check whether a copy of Netscape has international-strength encryption by viewing the splash page when Netscape launches, or by selecting Help/About. Linux systems shipped from within the United States, such as RedHat, are likewise limited to small key sizes because of export restrictions. (SSL is not a Magic Bullet)
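A back-of-the-envelope calculation shows why 40 bits is so little. The guess rate assumed below (one hundred million keys per second) is an illustrative figure, not a measurement from the cited source, but it makes the "less than a day" claim concrete.

```python
# Exhaustive search of a 40-bit key space at an assumed guess rate.
keyspace_40 = 2 ** 40                 # 1,099,511,627,776 possible keys
rate = 100_000_000                    # assumed guesses per second
worst_case_seconds = keyspace_40 / rate

# Roughly three hours in the worst case -- well under a day:
print(f"{worst_case_seconds / 3600:.1f} hours")

# A 128-bit key, by contrast, is far beyond exhaustive search at any
# plausible rate:
keyspace_128 = 2 ** 128
print(f"{keyspace_128 / rate / (3600 * 24 * 365.25):.2e} years")
```

The point of the comparison is that each added bit doubles the search space, so the gap between 40-bit export-grade keys and full-strength keys is astronomical, not incremental.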
A further difficulty, uncovered by Mitja Kolsek of ACROS in Slovenia, led to CERT Advisory CA-2000-05. Kolsek found that Netscape Navigator assumed a later access to the same IP address was part of the same SSL session; through DNS spoofing, Navigator could therefore be tricked into logging into a different website, defeating the mechanism intended to ensure that the server's name matches the common name in the server's certificate. Microsoft has also had difficulties with SSL. Versions 4 and 5 of IIS transmit identical session numbers in cookies for plain HTTP and SSL connections, another flaw unearthed by Kolsek. Moreover, Internet Explorer would disclose previously entered passwords that SSL had been protecting. Consider this: you log into a Web server over SSL and enter your username and password into a dialogue box presented by your browser. The username and password are part of HTTP's Basic Authentication scheme, but they were protected because the exchange was encrypted by SSL. (SSL is not a Magic Bullet)
If, however, you then follow another link within the same site without using SSL, your username and password are transmitted without any encryption and can be captured by any eavesdropper. Microsoft has engaged consultants to examine these security issues and has issued patches addressing both problems. Although SSL enhances the security of electronic business over the Web, it is not a perfect solution: weak encryption, unintended disclosure of sensitive information such as passwords or session cookies, and instances of mistaken identification all combine to undermine the idea that SSL is inherently safe. The SSL in use today also has a comparatively large and still uncorrected weakness: it relies on Web browsers that ship with a pre-installed list of top-level Certificate Authorities. (SSL is not a Magic Bullet)
The browser, however, fails to verify whether a certificate has been revoked. If the owner of a certificate loses control of the private key linked to it, that certificate can no longer be trusted and must be revoked; yet today's browsers have no built-in mechanism for checking certificate revocation. Although incidents arising from this situation are rare, they do occur. The answer is not to abandon SSL, but to address the limitations in the way it is currently used. As an Internet user, remember that the closed-padlock icon does not mean your data is absolutely safe; it means only that your data has been securely transmitted across the Internet to the rightful server. Beyond that, there are no guarantees about security. (SSL is not a Magic Bullet)
Even though SSL employs faster symmetric encryption for privacy, it still imposes a performance penalty, because SSL involves much more than data encryption alone. (SSL Acceleration and Offloading: What Are the Security Implications?) Cryptography is expensive: the cost of deploying SSL can considerably reduce the speed of a Web server and thus interfere with the website. Two of SSL's cryptographic operations occur at the point of data transmission, which takes place at the record protocol level. The data is encrypted, and a Message Authentication Code (MAC) is computed and appended to every record transmitted. The data encryption and record MAC operations account for the majority of the cost during data transfer. (Introduction to Secure Sockets Layer)
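The per-record work described above can be sketched with Python's standard `hmac` library. The function names, the 8-byte sequence number, and the record layout below are illustrative assumptions, not the actual SSL record format; the sketch only shows why every record costs a MAC computation on top of encryption.

```python
import hmac, hashlib, os

mac_key = os.urandom(32)  # per-session MAC key (illustrative)

def protect_record(seq_num: int, payload: bytes) -> bytes:
    # The MAC covers a sequence number plus the payload, so records
    # cannot be silently reordered or replayed.
    tag = hmac.new(mac_key, seq_num.to_bytes(8, "big") + payload,
                   hashlib.sha256).digest()
    # Encryption of payload+tag is omitted in this sketch.
    return payload + tag

def verify_record(seq_num: int, record: bytes) -> bytes:
    payload, tag = record[:-32], record[-32:]
    expected = hmac.new(mac_key, seq_num.to_bytes(8, "big") + payload,
                        hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("record MAC check failed")
    return payload

record = protect_record(0, b"GET /account HTTP/1.1")
assert verify_record(0, record) == b"GET /account HTTP/1.1"
```

Both the hash over every byte of the payload and the symmetric encryption pass scale linearly with the amount of data sent, which is why these two operations dominate the cost of the data-transfer phase.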
In the "handshake" process, the server, and occasionally the client, is authenticated using digital certificates based on asymmetric, or public-key, encryption technology. Public-key encryption, though extremely secure, has a considerable negative effect on performance because it is enormously processor-intensive. (SSL Acceleration and Offloading: What Are the Security Implications?) A handshake occurs at the start of each SSL session requested of the server, and the session key exchange accounts for the greater part of its cost. One method of reducing the number of handshakes is session resumption, in which the server builds up a cache of client sessions. This cache can, however, grow very large if the server handles heavy traffic. In that situation, memory and CPU can be pushed to their ceiling merely to retain the session caches, since a Web server may be handling numerous transactions per minute, implying perhaps hundreds of session resumptions as well. The result is a general slowdown of the Web server, in both HTTP and SSL transactions, which may even leave the server unresponsive for a time. This delay, or outright unavailability, translates into revenue that could have been earned in that period but was not. (Introduction to Secure Sockets Layer)
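The session cache trade-off described above can be sketched as a bounded lookup table: resumed sessions skip the expensive public-key exchange, but the cache itself consumes memory, so real servers cap its size. The class name, the tiny capacity, and the LRU eviction policy below are illustrative assumptions, not SSL's specified behaviour.

```python
from collections import OrderedDict

class SessionCache:
    """Bounded cache mapping session IDs to session secrets (sketch)."""

    def __init__(self, max_entries: int = 4):
        self.max_entries = max_entries
        self._cache = OrderedDict()

    def store(self, session_id: bytes, master_secret: bytes) -> None:
        self._cache[session_id] = master_secret
        self._cache.move_to_end(session_id)
        if len(self._cache) > self.max_entries:
            # Bound memory use: evict the least recently used session.
            self._cache.popitem(last=False)

    def resume(self, session_id: bytes):
        # A hit means the full public-key handshake can be skipped.
        secret = self._cache.get(session_id)
        if secret is not None:
            self._cache.move_to_end(session_id)
        return secret

cache = SessionCache(max_entries=2)
cache.store(b"id-1", b"secret-1")
cache.store(b"id-2", b"secret-2")
cache.store(b"id-3", b"secret-3")   # evicts id-1: the cache stays bounded
assert cache.resume(b"id-1") is None          # evicted: full handshake again
assert cache.resume(b"id-3") == b"secret-3"   # hit: key exchange avoided
```

The cap is the crux of the text's point: an unbounded cache converts the CPU cost of handshakes into a memory cost, and a busy server can exhaust either resource.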
E-commerce sites are particularly sensitive to SSL delays: customers who encounter sluggish responses and long waits take their business elsewhere. The first technique used to tackle the SSL performance problem was the hardware accelerator, a card containing a co-processor that executes a portion of the SSL processing, reducing the load on the Web server's main processor; it plugs into a PCI slot or SCSI port. Several vendors, including nCipher, make SSL hardware accelerators. Normally the only operation offloaded to a hardware accelerator is the RSA operation, which uses public-key cryptography. This is because symmetric encryption is much faster and does not need offloading; in fact, offloading those operations can reduce performance. (SSL Acceleration and Offloading: What Are the Security Implications?)
Thus the server's main processor executes the symmetric cryptography operations while the accelerator executes the asymmetric ones. The performance improvement obtained with a hardware accelerator varies from vendor to vendor; a few vendors claim improvements in SSL processing capacity beyond 500%. To increase capacity, another card can be added to the same server, or dual cards can be installed for high availability. Some cards build in supplementary functions such as key management. In contrast to server-side accelerators, so-called network accelerators are designed to work with network switches, intercepting and decrypting SSL traffic before it reaches the server.
This extends beyond mere acceleration into the territory of SSL offloading. By improving the performance of one's secure Web server, SSL offloading yields greater customer satisfaction. However, SSL offloading means the SSL connection runs from client to offloader, not from client to server: from the offloader, data travels to the server across the network unencrypted. That data is traversing one's internal network rather than the public Internet, so the question becomes how secure that internal network is, which depends largely on its topology. If the offloader and server sit behind a departmental firewall on a secure subnet where only essential servers are located and to which users have no direct access, one can confidently allow unencrypted data to pass from offloader to server. (SSL Acceleration and Offloading: What Are the Security Implications?)
If, on the other hand, one is offloading SSL processing to a firewall at the network edge, the exposure and risk of compromise once the data is decrypted is much greater. Finally, one must consider customer awareness and expectations. Customers told that they are making a secure, encrypted connection may expect the secure tunnel to run all the way from client to server, a reasonable assumption given that customers are not usually familiar with technologies such as SSL offloading. If confidential customer data were retrieved by unauthorized persons while travelling over the network in a decrypted state, and then misused, what liability concerns would one face? (SSL Acceleration and Offloading: What Are the Security Implications?)
The traditional implementation of SSL raises security concerns in addition to the performance issues. The server's private keys, which are critical to encrypting transactions, usually reside on an unsecured hard drive and are copied to the processor when required; with control of the Web server, both the processor and the hard drive are exposed to attack. By stealing the server's private key, a third party can easily set up a server that impersonates the original. The server authentication process could then be compromised, exposing confidential client information. In addition, stealing the server itself, stealing its disk drives, or physically monitoring its electrical signals are all ways of compromising the key material. It is therefore essential that private keys be safeguarded against both hackers and physical theft. (The Secure Sockets Layer Protocol – Enabling Secure Web Transactions)
Since SSL is a low-level protocol, it offers very little protection once your host is compromised. Moreover, when a key in a certificate is compromised it remains so, because there is no existing mechanism for checking with the CA's root that the key you are using has not been revoked; the keys do, however, carry expiry dates. Tracing a chain up to the root is no trivial undertaking, yet some mechanism for doing so ought to exist for high-value transactions, and out-of-band key revocation is wanted as well. At present, trusted key certifiers are compiled into the Netscape binary. The use of RC4 is questionable, even if understandable: RC4 is a relatively new cipher, and although the highly proficient Ron Rivest devised it, it has not been subjected to the same scrutiny by experts that DES and IDEA have received. The choice of RC4-40 was prompted by the fact that it obtains export approval from the State Department without fuss. Several aspects of SSL's design as it stands may present usability complications. Nobody has proposed an immediate mode of attack, but there are elements that could be changed for potential extra assurance. (Shostack, An Overview of SSL (version 2))
The limitations of the SSL protocol emerge when remote users need to log on to client-server applications; SSL was devised mainly to secure Internet-based application users. Several vendors have built, and others are building, client-server components that run a remote application proxy on the client computer to carry client-server sessions, but costs rise and so does complexity. Another problem with SSL is cost itself: SSL-based VPNs are priced at the high end, around $30k for 500 simultaneous users, while most businesses typically pay that amount for a VPN device that comes with an unlimited user license. (News & Analysis: Past, Presentation, and Future) SSL encryption presents a trade-off in which performance and safety sit at opposite ends of a scale; ultimately it is up to the user to assess his network, the type of data passing through it, and whether the performance sacrificed merits the upgrade to better security. (Secure Servers with SSL in the World Wide Web) Much remains to be addressed before any communication method can be considered secure to a substantial extent. Information in transit over the Internet is vulnerable to deception and mishandling by others: data travelling between a client workstation and a server is routed through many computer systems, and each of those systems represents a potential point at which the passage of information between client and server can be accessed. Regrettably, the Internet has no built-in security mechanism to prevent tampering with data in transit, whether the aim is fraud, snooping, illicit copying, or damage to the communication itself. Nearly every organization is confronted with these safety questions.
Initiatives originating in the SSL protocol to resolve some of these safety questions on the World Wide Web have been responsible for enhanced security over the Internet. (Secure Servers with SSL in the World Wide Web)
Nevertheless, events like Netscape's security fault underline concerns about Internet security as a whole. Many fallacies persist that need to be dispelled. Nearly every Internet user perceives security as something that is simply on or off; in reality, security is an arrangement that requires constant review and testing. Although expensive and intricate, such a system earns its keep and generates real advantages. The defects found in Netscape merely emphasize that security technology is not an end state but a gradual, ongoing process spanning a broad array of security grades. Security systems are never perfect or complete in themselves; they are bound to contain weak spots or holes, which get corrected through the interplay of network users and hackers. It is desirable that more and more people challenge and test the state of SSL's security. This study shows that the use of SSL contributes substantially both to the proper functioning of the system and to the satisfaction of the user, and its potential for responding quickly to a changing environment has increased the popularity of the security business. (Secure Servers with SSL in the World Wide Web)
The lawful interception, monitoring, capture, and analysis of HTTPS session data using network analysis and forensic tools
In reality, SSL is not itself a way of securing data; it is only a method of negotiating the appropriate security instruments. SSL's security depends on the encryption and authentication algorithms in addition to the actual SSL specification and implementation. The literature describes many attacks on SSL. Many of them, however, are mostly theoretical and do not succeed in practice; others succeed in theory but cannot be implemented. One attack that can be implemented poses a threat against privacy, as opposed to confidentiality and authenticity, the primary goals of SSL. (About SSL/TLS) When the standard attacks do not succeed, a cryptanalyst will look for more obscure ones. Traffic analysis, though frequently dismissed, is another passive attack worth bearing in mind. Its intention is to recover secret facts about secure sessions by examining unencrypted packets and unprotected packet attributes. By examining the unencrypted IP source and destination addresses and TCP ports, for instance, or by observing the volume of network traffic, a traffic analyst can determine which parties are communicating, what types of services are in use, and at times even glean information about business or personal relationships. (Analysis of the SSL 3.0 protocol) Even though the data packets themselves cannot be decoded, one can still assess their size, source, and destination, since these belong to the transport layer. Tools for capturing TCP connections or for IP spoofing are increasingly available and ever easier to use. What is the solution?
To illustrate, suppose one wishes to trace which Web pages a particular user visits. One simply runs a monitoring program and captures the TCP data between client and server. The data leaving port 80 on the client is captured in the knowledge that it consists of URL requests from the client's browser; the server's response, and in particular its size, can also be captured. Web mining technology such as that behind search engines like Google can then be employed to match the captured length of the URL against Web pages of the captured size returned by the server. Since this requires a fair amount of memory and computing power, not everyone can use the technique. Even so, monitoring Web traffic is an intrusion into someone's privacy, and recovering information about the pages they visited is a still greater offence. SSL does not deal with this effectively, and improving SSL here is difficult, because privacy of this kind was not an emphasis when the standard was drafted. (About SSL/TLS)
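The size-matching step of this attack can be sketched in a few lines: an observer who cannot read an encrypted response can still compare its observed length against a pre-built index of page sizes. The page names, sizes, and fixed per-record overhead below are invented purely for illustration.

```python
# Toy index of known pages on a target site, keyed by plaintext size
# (an attacker could build this by crawling the site in the clear).
page_sizes = {
    "/index.html": 14_302,
    "/pricing.html": 9_871,
    "/medical-faq.html": 22_450,
}

def guess_page(observed_ciphertext_len: int, overhead: int = 29) -> list:
    # Subtract an assumed fixed per-record overhead, then match exactly.
    plain_len = observed_ciphertext_len - overhead
    return [url for url, size in page_sizes.items() if size == plain_len]

# The observer sees only the ciphertext length, yet recovers the URL:
assert guess_page(22_450 + 29) == ["/medical-faq.html"]
```

In practice the attacker works with noisier data (compression, chunking, caching), but the sketch captures the essential point: unique response sizes act as fingerprints of the pages they encrypt.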
In reality, since SSL makes no attempt to counter this kind of traffic analysis, users generally consider the danger of such coarse-grained tracking to be fairly harmless, and overlooking coarse-grained traffic analysis seems a logical design choice. But within the SSL architecture, traffic analysis creates some subtle threats. As Bennet Yee has established, particulars about URLs in SSL-encrypted Web traffic can be exposed by inspecting ciphertext lengths. When a Web browser connects to a Web server through an encrypted transport such as SSL, the GET request containing the URL is conveyed in encrypted form; yet traffic analysis can recover the identity of the Web server, the length of the URL requested, and the length of the HTML data returned. Precisely which Web page the browser downloaded is obviously considered secret information, and for good reason, since knowledge of the URL is often sufficient for an adversary to obtain the entire Web page. Habitually, this leak could be used for spying to find out which Web page was accessed. The vulnerability exists because the ciphertext length can disclose the plaintext length: SSL supports random padding in the block cipher modes but not in the stream cipher modes. SSL should take the requirements of such applications seriously and, at minimum, support the use of random-length padding for all cipher modes. (Analysis of the SSL 3.0 protocol)
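The random-length padding recommended above can be sketched as follows: appending a random amount of filler (plus a one-byte length marker) before encryption blurs the link between ciphertext length and plaintext length. The format here is an illustrative assumption, not SSL's actual padding scheme.

```python
import os

def pad_random(plaintext: bytes, max_pad: int = 255) -> bytes:
    # Append 0..max_pad random filler bytes, then a one-byte pad length.
    pad_len = os.urandom(1)[0] % (max_pad + 1)
    return plaintext + os.urandom(pad_len) + bytes([pad_len])

def unpad(padded: bytes) -> bytes:
    pad_len = padded[-1]
    return padded[: len(padded) - 1 - pad_len]

msg = b"GET /secret-page HTTP/1.1"
assert unpad(pad_random(msg)) == msg

# Repeated paddings of the same message yield varying lengths, so an
# observer can no longer infer the exact plaintext size:
lengths = {len(pad_random(msg)) for _ in range(50)}
```

The cost is bandwidth: every record grows by up to `max_pad + 1` bytes, which is the trade-off the standard's authors declined to impose by default.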
It is essential that SSL firmly protect private data against active attacks. Obviously the underlying encryption algorithm must resist adaptive chosen-plaintext and chosen-ciphertext attacks, but this by itself is not sufficient: research initiated within the IETF IPsec working group has shown that refined active attacks on a record layer can break a system's confidentiality even when the underlying cipher is strong. The SSL 3.0 record layer appears to withstand these powerful attacks, and the reasons deserve some discussion. Bellovin's cut-and-paste attack is one important active attack on IPsec. Remember that, to ensure privacy, the receiving endpoint must safeguard sensitive data from inadvertent disclosure in addition to providing link encryption. The cut-and-paste attack exploits the fact that most endpoint applications treat inbound encrypted data differently depending on context, guarding it more diligently when it appears in some forms than in others. (Analysis of the SSL 3.0 protocol) A fundamental behaviour of cipher-block chaining mode is that it recovers from errors within one block: relocating a few consecutive ciphertext blocks to other positions within a ciphertext stream therefore produces a corresponding shift of plaintext blocks, apart from a one-block error at the start of the splice. The cut-and-paste attack turns this behaviour to advantage. More precisely, Bellovin's attack cuts encrypted ciphertext from a packet containing sensitive data and merges it into the ciphertext of a new, carefully chosen packet, such that the receiving endpoint can be expected to reveal the corresponding plaintext unintentionally after decryption.
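The CBC property the attack exploits can be demonstrated end to end with a toy cipher. The 8-byte XOR "block cipher" below is a deliberate stand-in (real CBC uses a cipher such as DES or AES), but the chaining behaviour shown, where splicing consecutive ciphertext blocks garbles only the first spliced block while later ones decrypt to their original plaintext, holds for any block cipher.

```python
BLOCK = 8
KEY = bytes(range(BLOCK))  # toy key for the toy XOR "block cipher"

def enc_block(b): return bytes(x ^ k for x, k in zip(b, KEY))
dec_block = enc_block  # XOR is its own inverse

def xor(a, b): return bytes(x ^ y for x, y in zip(a, b))

def cbc_encrypt(plaintext, iv):
    blocks, prev = [], iv
    for i in range(0, len(plaintext), BLOCK):
        prev = enc_block(xor(plaintext[i:i + BLOCK], prev))
        blocks.append(prev)
    return b"".join(blocks)

def cbc_decrypt(ciphertext, iv):
    out, prev = [], iv
    for i in range(0, len(ciphertext), BLOCK):
        c = ciphertext[i:i + BLOCK]
        out.append(xor(dec_block(c), prev))
        prev = c
    return b"".join(out)

iv = b"\x00" * BLOCK
pt = b"block-0!SECRET-1SECRET-2block-3!"   # four 8-byte plaintext blocks
ct = cbc_encrypt(pt, iv)

# The attacker, without the key, splices ciphertext blocks 1-2 over
# positions 2-3 of the stream:
spliced = ct[:BLOCK * 2] + ct[BLOCK:BLOCK * 3]
recovered = cbc_decrypt(spliced, iv)

# Only the first spliced block is garbled; the next one decrypts to the
# relocated plaintext "SECRET-2" in its new position:
assert recovered[:BLOCK * 2] == b"block-0!SECRET-1"
assert recovered[BLOCK * 3:] == b"SECRET-2"
```

This is exactly the one-block-error-then-recovery behaviour the text describes: the splice costs the attacker one garbled block, and everything after it decrypts cleanly in the new context.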
For instance, if the cut-and-paste attack could be mounted on the SSL record layer, it could be used to defeat site security as follows: a cut-and-paste attack on an SSL server-to-client Web page transfer merges ciphertext from a sensitive part of that HTML transfer into the hostname segment of a URL embedded elsewhere in the relocated page; when a user clicks the booby-trapped link, his browser interprets the decryption of the spliced ciphertext as a hostname and transmits a DNS lookup for it in the clear, ready for capture by the eavesdropping attacker. In essence, cut-and-paste attacks trick the unwary receiver into decrypting and leaking sensitive data by mistake. SSL 3.0 prevents cut-and-paste attacks. (Analysis of the SSL 3.0 protocol)
Using independent session keys for each distinct context is one partial defence against cut-and-paste attacks: it prevents cutting and pasting between the two directions of a connection, between different connections, and so on. SSL already uses independent keys for each direction of each incarnation of each connection. Even so, this does not stop cutting and pasting within one direction of a transfer. The most complete guard is strong authentication on all encrypted packets, preventing an adversary from altering the ciphertext at all; since the SSL record layer employs this guard, cut-and-paste attacks are fully thwarted. Another active attack against IPsec found in Bellovin's paper is the short-block attack. It was first applied against DES-CBC IPsec-protected TCP data in which the final message block contains a short one-byte plaintext with the remainder filled with arbitrary padding. To deduce the unknown plaintext byte, one swaps the last ciphertext block with a different ciphertext block from a known plaintext/ciphertext pair. (Analysis of the SSL 3.0 protocol)
Correct guesses can be recognized through the validity of the TCP checksum: a correct guess causes a familiar ACK to be returned, while a wrong guess causes the receiver's TCP stack to silently drop the packet. Using knowledge of the plaintext corresponding to a correctly guessed substitute ciphertext block, the adversary can recover the unknown plaintext byte. Because the receiving ipsec stack ignores the padding bytes, the short-block attack needs only about 2^8 known plaintexts and 2^8 active online trials to recover such an unknown trailing byte, a considerable simplification over brute-force procedures. There are no obvious short-block attacks on SSL. Since the SSL record layer format is quite similar to the previously vulnerable ipsec layout, it is admittedly plausible that an improved version of the attack could work against SSL. However, since typical SSL-encrypting Web servers do not usually encrypt short blocks, a short-block style of attack would not threaten them. (Analysis of the SSL 3.0 protocol)
In 1996, the IETF set out to produce a standardized secure method of communicating over the web. Starting from SSL 3.0, the task force in 1999 delivered RFC 2246, which defined the new Transport Layer Security (TLS) protocol in its version 1.0. The fundamental goal set for TLS was similar to that of the SSL standards: to offer security and data integrity at the transport layer between two web applications.
Furthermore, the designers added the following characteristics to TLS. Interoperability: TLS-enabled applications can be built, and TLS parameters exchanged, without either party knowing the details of the other's TLS implementation. Expandability: TLS offers a framework into which new cryptographic technologies can be incorporated with little trouble, without changes to the protocols layered on top of it. Both TLS and its forerunner SSL contain two basic protocols: the TLS Record Protocol, which offers security and integrity for data sent during the client/server session and is similar to the SSL record protocol, and the TLS Handshake Protocol, which negotiates connection parameters and performs the same function as the SSL handshake described earlier. TLS was created as a foundation for the application layer protocols, which are therefore layered on top of the TLS protocol.
However, the specification in RFC 2246 does not describe how these upper-layer protocols should use TLS to shield their transmissions. With the help of application and protocol designers, the TLS task force set out to close this gap. In addition to the protocol specification itself, the IETF developed two supplementary RFC documents: RFC 2817, "Upgrading to TLS within HTTP/1.1," and RFC 2818, "HTTP Over TLS." Both documents describe an alternative to the Secure Sockets Layer technique used so far, namely TLS implemented over the HTTP protocol. In contrast to the SSL protocol, they show, among other things, a technique for using a TLS-secured HTTP connection with no need for an extra port for encrypted connections: a standard HTTP port is used to start an encrypted TLS connection. A subsequent instance of an application specification for TLS is RFC 2487, "SMTP Service Extension for Secure SMTP over TLS," which describes a way to set up a secure SMTP connection using a standard port and protocol extensions. TLS is supported today by both client and server components; the Internet Explorer browser, for instance, supports it. Despite this, SSL remains the most commonly used method of securing Internet communications at present. TLS, however, is anticipated to replace SSL and develop into the accepted security standard for Internet transmission services. (Secure Socket Layer: www.windowsecurity.com)
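The RFC 2487 upgrade-in-place flow can be sketched with Python's standard `smtplib`. This is a minimal sketch, not tied to any real server: the client opens a plain session on the standard port, then issues STARTTLS to upgrade that same connection to TLS before any credentials or mail cross the wire.

```python
import smtplib
import ssl

def send_over_starttls(host: str, port: int = 25) -> None:
    """Upgrade a plain SMTP session to TLS on the standard port (RFC 2487)."""
    ctx = ssl.create_default_context()   # CA verification + hostname check
    with smtplib.SMTP(host, port) as smtp:
        smtp.ehlo()                      # server advertises STARTTLS capability
        smtp.starttls(context=ctx)       # upgrade the existing connection
        smtp.ehlo()                      # re-negotiate capabilities over TLS
        # ... authenticate and send mail as usual, now encrypted ...
```

No second "secure port" is needed, which is exactly the contrast with classic SSL deployments (HTTP on 80, HTTPS on 443) that the RFCs above draw.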
SSL cryptographically authenticates vulnerable communications besides shielding the confidentiality of application data. On the Internet, active attacks are becoming easier to launch every day: commercially available software packages already exist for IP spoofing and TCP session hijacking, and they even offer user-friendly graphical interfaces. Additionally, the financial incentive for exploiting communications security weaknesses is rising swiftly. This demands strong message authentication. SSL safeguards the integrity of application data by means of a cryptographic MAC. The SSL designers chose HMAC, a simple, fast, hash-based construction, because of the sound theoretical proof of its security. These provable security results are very appealing compared with the numerous early ad-hoc proposals for MACs, many of which have been cryptanalyzed. (Analysis of the SSL 3.0 protocol)
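The HMAC construction the designers chose is short enough to write out. The sketch below implements the modern RFC 2104 form, H((K XOR opad) || H((K XOR ipad) || m)), with SHA-256, and checks it against Python's standard library (SSL 3.0 itself uses an earlier, slightly different variant of the construction):

```python
import hashlib
import hmac

def hmac_sha256(key: bytes, msg: bytes) -> bytes:
    # HMAC (RFC 2104): H((K ^ opad) || H((K ^ ipad) || msg))
    block = 64                                  # SHA-256 block size in bytes
    if len(key) > block:
        key = hashlib.sha256(key).digest()      # hash down over-long keys
    key = key.ljust(block, b"\x00")             # pad key to one full block
    ipad = bytes(b ^ 0x36 for b in key)
    opad = bytes(b ^ 0x5c for b in key)
    inner = hashlib.sha256(ipad + msg).digest()
    return hashlib.sha256(opad + inner).digest()

# Cross-check against the standard library's HMAC.
assert hmac_sha256(b"key", b"message") == hmac.new(
    b"key", b"message", hashlib.sha256).digest()
```

The two nested hashes, keyed on the inside and the outside, are what give HMAC its provable security even against length-extension tricks that break naive keyed-hash MACs.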
HMAC is an outstanding choice for SSL and is developing into the gold standard of message authentication. Barring major unforeseen cryptanalytic developments, it appears improbable that HMAC will be broken in the near future. We note that SSL 3.0 uses an older, outdated version of the HMAC construction; for utmost security, SSL should move to the revised, up-to-date HMAC format when appropriate. Overall, SSL 3.0 seems very safe against direct brute-force or cryptanalytic attacks on the MAC. SSL 2.0 employed an insecure MAC, which was a grave design fault, although post-encryption prevented this from being a direct weakness; SSL 3.0 has fixed this error. The SSL MAC keys, with no less than 128 bits of entropy, offer excellent protection even in export-weakened and domestic-grade implementations. Independent keys are assigned to each direction of each connection and to each new incarnation of a connection. (Analysis of the SSL 3.0 protocol)
The choice of HMAC itself can be expected to defeat cryptanalytic attacks. Non-repudiation services are outside the scope of SSL, and it is rational to deliberately leave them to specialized higher-level application-layer protocols. The naive use of a MAC does not necessarily stop an opponent from replaying stale packets. Although replay attacks can be handled easily, they are a genuine concern, and it would be foolish to ignore the threat. SSL defends against replay attacks by incorporating an implicit sequence number in the MACed data. This mechanism also defends against delayed, re-ordered, or deleted data. Since sequence numbers are 64 bits long, wrapping should not be a problem, and since they are kept separately for each direction of each connection and refreshed upon each new key exchange, there are no apparent weaknesses. (Analysis of the SSL 3.0 protocol)
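How the implicit sequence number defeats replay can be shown in a few lines. The field layout below is simplified from the real record-layer MAC input (which also covers version and length fields), but the principle is the source's: the 64-bit counter is folded into the MACed data, so the same payload produces a different tag at every position in the stream.

```python
import hashlib
import hmac
import struct

def record_mac(mac_key: bytes, seq_num: int, record_type: int,
               payload: bytes) -> bytes:
    # Simplified SSL/TLS-style record MAC: a 64-bit implicit sequence
    # number is MACed along with the data, so a replayed, re-ordered,
    # or deleted record fails verification at the receiver.
    msg = struct.pack("!QB", seq_num, record_type) + payload
    return hmac.new(mac_key, msg, hashlib.sha256).digest()

key = b"\x01" * 32
tag0 = record_mac(key, 0, 23, b"hello")
tag1 = record_mac(key, 1, 23, b"hello")

# Same payload at a different sequence position -> different MAC,
# so a captured record cannot simply be replayed later.
assert tag0 != tag1
# The legitimate record still verifies at its correct position.
assert hmac.compare_digest(tag0, record_mac(key, 0, 23, b"hello"))
```

Because the counter is implicit (never transmitted), an attacker cannot adjust it; the receiver's own counter advances with each record and any misplaced record simply fails its MAC check.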
SSL is a ubiquitous desktop technology employed to secure traffic in transit across the Internet, and every browser contains it. SSL VPNs capitalize on the ubiquity of the browser-integrated SSL client to give safe, clientless access to a company's internal resources. SSL VPNs are emerging as a feasible substitute for full network VPNs, and are particularly well suited for traveling employees and extranet applications where secure, limited access to a certain group of applications is required. Used in this manner, SSL VPNs are simple to install and maintain compared to conventional IPSec VPNs. SSL VPN is the much-sought-after answer for extending application access beyond a business enterprise's security firewall. (SSL VPN: IPSec Killers or Overkill?)
A deluge of new products built on existing web switching platforms has created a distinct, competitive market for SSL-based remote access. Over the previous year, vendors have advanced SSL VPN technology to cover much more than web-based applications alone. Current SSL VPN products can securely deliver thin, web-distributed applications as well as fat-client productivity tools, including Microsoft Outlook, Lotus Notes, and Citrix. Unfortunately, these heavyweight applications are not SSL-enabled on the local machine. To support these clients, SSL VPN vendors wrote tiny client-side Java and ActiveX agents that intercept and dispatch traffic on behalf of the fat client. SSL VPN vendors continue to call this clientless access, when in fact they have installed a thin client to encapsulate local application traffic within SSL channels. (SSL VPN: IPSec Killers or Overkill?)
SSL VPNs greatly reduce the management burden of remote access, since, unlike IPSec VPNs, only particular applications are allowed through an SSL VPN, lessening the opportunities for illegal network intrusion. SSL VPNs offer many technical and security benefits over IPSec VPNs. SSL is incorporated in almost every desktop, handheld device, and kiosk. Tunneling applications over SSL avoids opening extra holes in the enterprise firewall, lessening the danger of security breaches. SSL is simpler to deploy than IPSec, since many corporate firewalls already pass SSL traffic to support e-commerce. SSL traffic can travel through Network Address Translation (NAT) without any glitches, whereas IPsec needs extra care. More significantly, SSL VPNs give the system administrator per-user access control over a strictly individual list of applications. Each of these benefits contributes to enhanced security and a lower overall cost of ownership. (SSL VPN: IPSec Killers or Overkill?)
When SSL VPN vendors introduce support for network-layer access, as many have already done, every security benefit of the SSL VPN is thrown away. Because SSL VPNs can be started from almost any client, a huge risk looms of the entire network being infected with Trojan horses, viruses, and malicious code from a foreign, unsecured workstation. This danger does not usually arise with IPsec VPNs, since the network manager typically has strong control over the workstation on which the IPsec client is configured. IPsec clients also enable integration at the driver level, guaranteeing that more intricate applications such as Microsoft NetMeeting run perfectly. Downloaded SSL VPN clients cannot achieve this level of integration. SSL VPN vendors will face considerable difficulty when their network-layer access cannot deliver the same performance as an IPsec VPN. SSL VPNs and IPSec VPNs were created to tackle different security problems in different settings. (SSL VPN: IPSec Killers or Overkill?)
By clumsily uniting the application and network layers in a single SSL VPN, both technologies are prevented from functioning optimally and a network security problem is created. Consumers reviewing secure remote-access technologies must be made aware that network-layer SSL VPNs pose a threat to their network, and should ask vendors how they deal with the perils of viruses, Trojan horses, and other damaging code on the remote host when configuring SSL-based VPN access from foreign hosts. Ideally, customers should stick to IPsec for network-layer access, and employ SSL VPNs to solve the application-layer access problems they were built to address. (SSL VPN: IPSec Killers or Overkill?)
The technology as a whole is confined to the traffic it can supervise, which means that encrypted traffic such as SSL and SSH remains invisible to nearly all Network Forensic Analysis Tools (NFATs). The dilemma here is that SSL-enabled Web servers are susceptible to the same misuses: if an SSL-enabled web server is kept in a production setting without patches, an unscrupulous user can launch attacks without being noticed by the monitoring instrument. (Network Forensics Analysis Tools: An Overview of an Emerging Technology) Because hackers are preparing to take advantage of the defects, Microsoft is advising customers to install a current software patch for Secure Socket Layer vulnerabilities in Windows right away. The patch, MS04-011, rated 'critical' by Microsoft, was made available on 13 April 2004. It fixes 14 discrete vulnerabilities, primarily bugs in the SSL implementation of all Windows systems. According to Stuart Okin, chief security officer at Microsoft UK, there is heightened risk owing to amplified activity in the hacking community; exploit code has been released and several sources are discussing the release of exploit tools, so customers are advised to install the patches at once. (Microsoft warns of SSL attacks)
For SSL to provide a secure connection, the client and server systems, keys, and applications must all be secure. In addition, the implementation must be free of security errors. The system is only as strong as the weakest key-exchange and authentication algorithm supported, and only trustworthy cryptographic functions should be employed. Small public keys, 40-bit bulk encryption keys, and anonymous servers must be used with great vigilance. Administrators and users must exercise caution when choosing which certificates and certificate authorities to trust, since an unscrupulous certificate authority can wreak immeasurable devastation. (Security analysis)
The exploration of how access to additional client and/or server data could be used to assist in the decryption of captured data
How can access to additional client or server data be used to assist in the decryption of captured data? As we have seen, network security is today a major concern for all Internet users who send information of any sort across the Net. Security analysis is an important factor governing the safety of these documents, and there are many different ways of ensuring this safety. One of these is the 'asset analysis' method, which analyses the loss of value in a network as well as the ability to communicate. 'Threat analysis' is another measure of the losses occurring through lapses in security; it analyses threats such as intentional attacks on the network, as well as unintended mistakes that result in security lapses. Inherent weaknesses in design may also lead to losses.
An analysis of these weaknesses in design is referred to as 'vulnerability analysis'. The last step of all is the 'risk analysis', which is the sum of the previous three analyses. Since the entire system cannot be dealt with in detail at this point, the security building blocks that make up the security of the end product are analyzed here. The first is preservation of service: 'denial of service' attacks, which sometimes block the free availability of service by provoking a bad network configuration whose mistakes in turn block the service, are dealt with in a manner that preserves service for the network user in spite of malicious intent. Access control is also performed in some cases, meaning that access is given only to authorized persons and not to unidentified persons who might block the entire service. In addition, data integrity is preserved, and access by unauthorized eavesdroppers to information on the site is controlled. Where unidentified and unauthorized personnel have actually captured data, they can launch active attacks, threatening the very integrity of the information being sent over the Internet. (White Paper: Building secure networks using Xpeedium2™ devices)
SQL Pass-Through (SPT) technology allows a user to send any SQL statement directly to a server. The advantage of sending statements directly is that they generally execute on the back-end server, and therefore enhance the performance of client/server applications. (SQL Pass Through) SPT has been described as one of the most important and advanced developments in client/server application development, especially when used with the 'stored procedure' method. In this method, a collection of Transact-SQL statements is grouped together, compiled under a particular name, and generally treated as a single unit. Stored procedures can be used to manage an SQL Server and to display information about users and other databases. (Stored Procedure) The process of calling stored procedures through SPT has been commended for its efficiency in client/server development, particularly in comparison with 'remote views'.
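The performance argument for pass-through can be illustrated with a minimal sketch. Here Python's in-process `sqlite3` stands in for a remote back end, which is an assumption of the example rather than anything in the source: the point is that the statement text goes to the engine as-is and the aggregate runs there, so only the one-row result, not the whole table, crosses the connection.

```python
import sqlite3

# sqlite3 stands in for a remote back-end server in this sketch.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL)")
con.executemany("INSERT INTO orders (total) VALUES (?)",
                [(9.5,), (20.0,), (3.25,)])

# "Pass-through": the SQL statement executes on the engine side, and
# only the aggregated one-row result comes back to the client.
(grand_total,) = con.execute("SELECT SUM(total) FROM orders").fetchone()
assert grand_total == 32.75
```

With a genuinely remote server, the difference between shipping one aggregated row and shipping every order row is exactly the client/server performance gain the text describes.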
Remote views are views that use data available outside the current database, for example on SQL Server. In other words, the particular SQL code that the server understands is used to create the views, which is extremely useful to the user. The main advantage of a remote view is the ease with which it can be used. (Remote Views) A remote view is generally considered the least scalable of the available techniques, mainly because the connection to the database must be left open for the entire time the remote view is in use. Where a middle tier is used for the remote view this can be minimized and some control exerted, but this generally requires additional effort. In some cases remote views can be accessed over one single connection, with all the SPT traffic directed to another connection; this makes things easier and faster, with less confusion both for the application and for the programmer. SPT, SPT with stored procs, and ADO (ActiveX Data Objects) are therefore all extremely scalable.
The connections need to be kept open only while data retrieval and data updating are actually going on. In reality, ADO is the only native remote data access technique that allows data to be passed between tiers. One problem with ADO, however, is that it requires a COM call every time. For example, where a VFP scan loop passes through a local cursor touching each record, making ADO do the same thing multiplies the ADO calls quickly into hundreds and thousands, causing performance differences of orders of magnitude. What must be remembered is that using any technique other than remote views does not automatically increase scalability; the developer must write code that closes connections when they are not needed and reopens them when they are, so that the scalability gains are actually achieved. The snare here is that quite a few developers do not bother with closing and opening connections at all.
This is because, while the connection is open, information is readily and quickly available, whereas closing and reopening the connection means the information takes more time to be processed and made available to the user. To counter this, MTS caching was developed; COM objects under MTS caching are capable of connecting and disconnecting any number of times, and MTS can cache almost all ODBC functions, including connections to the SQL server. When a remote view is to be used, it needs a Visual FoxPro .DBC, which is in other words a procedure file in .DBF format. When the middle tier is also used, performance becomes optimal. The programmer writing the code must therefore make sure to add the .DBC to the application start-up routine, so that all applications are guaranteed to be using the latest version of the DBC. (Remote Views)
In a system where remote views, SPT, and SPT with stored procs are used in the middle tier, it is better to make sure that the middle tier is provided with a data format that allows the passage of data between the tiers. VFP cursors may be converted to ADO record sets, to XML, to Scatter Name objects, and to arrays; it must be remembered, though, that ADO is the only data access process by which data retrieved from the server can be passed directly to the client without needless data conversions. Where security is concerned, all remote views need a user account specifying what the user may do: retrieval, insertion, updates, and deletions, if any. When SQL Pass-Through is used, a user account must likewise exist specifying access rights for retrievals, insertions, updates, and, finally, deletions.
For SPT with stored procs, a user account must exist that grants execute rights on the procedures being called. In the case of SQL Server 7.0, a guest account can be used to connect to the server, and an application role can be used to upgrade security for the duration of the query. When SQL is constructed on the fly with the help of ADO, the security required is the same as for SPT, and when ADO is used to call stored procedures, the security is the same as for SPT with stored procs. As for remote view data, the remote view is the only method by which data can be accessed throughout VFP, making it the most flexible and easiest to use of all the data access techniques. Standard data binding and the VFP Report Writer are generally used, enabling users to invoke the buffering-related functions Requery(), TableUpdate(), and TableRevert(). (Remote Views)
All these functions are available through remote view access without any additional work or procedures for accessing information. The VFP View Designer creates very simple remote views, whereas more complicated ones are created using tools such as xCase and eView, or by hand coding, which is in general a very difficult process. While remote views can be used to access different back ends, and can be used in the form of a table or as form and report data with the standard drag-and-drop approach, SPT, ADO, and SPT with stored procs all require more thorough knowledge of the back end than a remote view does. Furthermore, using SPT, ADO, or SPT with stored procs involves more code than using a remote view. (Remote Views)
VFP developers are able to treat data in remote databases such as SQL Server on a par with local VFP tables, thanks to the numerous advantages of the remote view system. The entire development becomes simpler, less complex, and therefore more comprehensible. Compared with SQL Pass-Through (SPT), stored procedures, and ADO, the remote view shows more advantages and fewer disadvantages, especially when development against more than one database is sought. With remote views, a system that works against any database can be written with little or no extra work, since the code does not have to be changed and no great experience with back-end databases such as SQL Server or Oracle is needed. In contrast, a stored-procedure system would need a different database version for each system, and all of these would have to be developed and maintained separately. (More on Remote Views: Rational Decision-making about Remote Views)
The Common Internet File System (CIFS) is a system for sharing files and printers between computers, and is also one of the network protocols that deal with file and information sharing. The sharing is generally done between two entities, the client and the server. To share something, the user specifies the file or the printer to be shared on the server. The client can then log on to the server, send a CIFS (Common Internet File System) packet over the Internet demanding access to the particular resource it wants, and give its name and password to the server. Once this protocol exchange is complete, the user can start working with the files on the remote computer that he has indicated he wants to use.
The main reason CIFS is so popular is that it is the system used by all Microsoft operating systems: it is used, for example, in Windows for Workgroups, Windows 95, and Windows 2000. Users of Macintosh and Unix operating systems can also use third-party software to become CIFS-capable. The market potential of the CIFS protocols is tremendous and seems to be growing every day: cell phones and MP3 players are getting connected to the Internet, and quite a few home Internet users are starting to use a local area network to share files and printers. This trend suggests that, in the future, CIFS-enabled electronic devices could offer exciting new services to the interested consumer. A CIFS-enabled computer is capable of gaining access to any file or printer being shared on the Internet or the LAN (local area network).
Devices can therefore be connected through CIFS in new and novel ways, creating gadgets that are more useful to the consumer. One example is the cell phone: connected as a CIFS client, it could wirelessly access a file that is actually located on a desktop, or access its owner's address book from Outlook Express. CIFS thus has the capability of opening up a whole new world of CIFS-enabled electronic devices for today's customer. Data that could not be accessed a few years ago is now easy to reach, and decryption becomes an easier task. eCIFS is a portable implementation developed, designed, and created by Code FX. It consists of a 'C' source code implementation and allows devices to communicate on a CIFS network, meaning that file and printer sharing is accomplished through the LAN. The inherent benefits of using a program like eCIFS are these: a CIFS-enabled product can speak the language of CIFS almost at once, allowing immediate access to shared files and printers.
The other advantage of using eCIFS is that it imposes no storage requirements: remote file access does not need local storage, flash memory units, or a special hard disk, which in turn gives the designer more freedom. When eCIFS needs to accomplish a system-specific function, it goes through an additional layer, referred to as the 'abstraction layer', which wraps all the system-specific utilities in generic calls. This is used for functions such as TCP/IP networking, inter-process communication, and thread spawning. When eCIFS must be ported to a new system, the contents of the generic calls are modified to match the function calls of the new system. All the generic calls are contained in four files, so the bulk of the eCIFS source code is left untouched and unchanged; however, the TCP/IP network protocol must be present in the new system. (eCIFS in Depth) eCIFS can be taken as another method of accessing client or server data, after which decryption of the captured data can be attempted.
Test2day was started in 2002 as a specialist consultancy firm that excels in the creation, construction, management, and design of performance tests for business systems. The test assets the company develops use the best methods and tools available. Among its services, the company advises clients on the best way to evaluate the performance of their new or upgraded systems. The phases of a test procedure are, first, operational profiling; second, test design; third, test build, covering the configuration of data, monitors, and scripts. The next step is the actual testing phase, and the last is the analysis of the results of the testing that has been conducted. (Company Overview)
Test2day holds that in today's world, where there have been many instances of system failure forcing a company to shut its offices, the main cause is testing that was not conducted properly and efficiently before deployment. Not long ago, client/server systems were planned for deployment within the company for a known number of users, and performance testing began in this period to ensure business continuity. Now most applications are exposed to the large user populations of the Internet and extranets. For this reason multi-tier architecture has been spreading significantly, and the loads such systems may have to bear have become difficult to predict. The performance of a system must therefore be tested before it is exposed; performance testing is a virtual must today. (Company Overview)
Several areas are evaluated for performance. The response time of the system as the load increases is measured by the stepped-load test. The stress test investigates the maximum number of users the system can handle past the 'bend' or 'kink', the peak after which the system can no longer process queries at optimum speed. Sometimes the performance of the system is evaluated over a long period to detect lapses and leaks in memory and other issues; this is called the 'extended load test' or 'soak testing'. The 'scalability test' runs the same test repeatedly against one particular target so that performance can be rated at different levels of scale. When multiple virtual users are made to use the system at the same time to expose the defects that arise from concurrent usage, it is called the 'spike test'.
The system is sometimes made to fail while under load so that its performance in this state can be investigated; this is the 'fail-over under load' test. In addition, regression, network load, and logon tests are conducted to evaluate performance levels. Finally, post-deployment monitoring is carried out so that any mistakes after deployment of the system can be corrected at an early stage. (Company Overview) This type of testing helps to optimize performance levels as well as to assist in the decryption of captured data.
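The stepped-load idea above can be sketched as a toy harness. This is a hypothetical illustration, not any tool of Test2day's: the load is stepped up in stages and the mean response time per request is recorded at each stage, which is the curve on which the 'bend' or 'kink' would later be read off.

```python
import time

def stepped_load(handler, steps=(1, 2, 4, 8), requests_per_step=100):
    """Toy stepped-load test: mean response time per request at each load step."""
    results = {}
    for users in steps:
        start = time.perf_counter()
        # Simulate `users` concurrent clients by issuing users * N requests.
        for _ in range(users * requests_per_step):
            handler()
        elapsed = time.perf_counter() - start
        results[users] = elapsed / (users * requests_per_step)
    return results

# A stand-in "request handler" doing a little work.
res = stepped_load(lambda: sum(range(100)), steps=(1, 2, 4))
assert set(res) == {1, 2, 4}
assert all(t > 0 for t in res.values())
```

A real harness would drive genuinely concurrent virtual users and record percentiles as well as means, but the shape of the measurement, response time as a function of load step, is the same.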
SSL is a layer of protocols that sits beneath application protocols such as IMAP, HTTP, and SMTP. When the information or messaging passed between the client and the server, and between servers, is encrypted, it can be considered safe and unavailable to eavesdroppers; when the parties are also authenticated, the chance of intrusion or eavesdropping is smaller still. Combining SSL with SMTP, HTTP, and IMAP provides end-to-end encryption of transmissions between server and client, and adding a hardware encryption accelerator to an SSL connection can improve the server's performance to an even greater extent. The following is the process an SSL-enabled connection follows in its communication between client and server. (Designing a Secure Messaging Server)
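The "layer beneath the application" idea can be seen in how modern libraries expose SSL/TLS. A minimal sketch using Python's standard `ssl` module (no connection is actually made here; the commented host name is hypothetical):

```python
import ssl

# A client-side SSL/TLS context as Python's standard library builds it.
# wrap_socket() is the point where SSL slides in beneath the application
# protocol (HTTP, IMAP, SMTP, ...), which then runs unchanged on top of
# the encrypted transport:
#
#   with socket.create_connection(("mail.example.com", 993)) as raw:
#       tls = context.wrap_socket(raw, server_hostname="mail.example.com")
#       ... speak IMAP over `tls` exactly as over a plain socket ...
context = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)

print(context.verify_mode == ssl.CERT_REQUIRED)  # peer certificate is required
print(context.check_hostname)                    # and must match the server name
```

The application code above the wrapped socket is identical to the unencrypted case, which is exactly why SSL composes so easily with HTTP, IMAP, and SMTP.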
The client starts the communication over HTTPS by specifying the secret-key algorithms it can use, and the server replies by sending the client its certificate, which specifies the secret-key algorithm to be used and authenticates the server. The server states the strongest algorithm it possesses, and the client's algorithm must be a close match; if the disparity is too wide, the connection is refused altogether. When the client receives the certificate it must take the time to check it thoroughly: for a signature from a well-known authority, for a clearly stated expiration date, for certification by a good certifying authority, and for a host name on the certificate that matches the name of the server given in the client's HTTPS request. (Designing a Secure Messaging Server)
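Two of the client-side certificate checks listed above (expiration date and host name) can be sketched against the dictionary form that Python's `SSLSocket.getpeercert()` returns. This is illustration only: the certificate fields and host names are hypothetical, and the signature chain back to a trusted authority is verified by the SSL library itself, not reproduced here.

```python
from datetime import datetime, timezone

def check_certificate(cert, expected_host, now=None):
    # Sketch of two client-side checks on a getpeercert()-style dict:
    # has the certificate expired, and does it name the host we asked for?
    now = now or datetime.now(timezone.utc)
    not_after = datetime.strptime(
        cert["notAfter"], "%b %d %H:%M:%S %Y %Z").replace(tzinfo=timezone.utc)
    if now > not_after:
        return "expired"
    names = {value for kind, value in cert.get("subjectAltName", ()) if kind == "DNS"}
    if expected_host not in names:
        return "host name mismatch"
    return "ok"

# Hypothetical certificate fields, for illustration only.
cert = {"notAfter": "Jan  1 00:00:00 2030 GMT",
        "subjectAltName": (("DNS", "mail.example.com"),)}
print(check_certificate(cert, "mail.example.com"))   # ok
print(check_certificate(cert, "other.example.com"))  # host name mismatch
```

A mismatch on any of these checks is precisely what lets a client refuse a connection to an impostor server.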
SSL operates with a cipher algorithm, the basic algorithm used for both encryption and decryption. Ciphers vary in strength: a cipher can be very strong, very weak, or somewhere in between, and the stronger it is, the more difficult it is for an outsider or unauthorized person to unscramble the message. A cipher works on the data using a particular key, and in general the longer that key, the more difficult it is to decrypt the encrypted messages without the proper decryption key. When an SSL connection is being established between client and server, the client specifies the cipher and the particular secret-key length it prefers for encryption, and the server must use that same cipher in any further communication. (Designing a Secure Messaging Server)
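The cipher-and-key negotiation can be made concrete by listing what a client would offer. A small sketch with Python's `ssl` module: the list below is the client's offer, and in a real handshake the server picks one suite from it that both sides support. The exact suites printed depend on the local OpenSSL build.

```python
import ssl

# The cipher suites a default Python client context would offer; the server
# selects one it also supports, and both sides then use it for the session.
context = ssl.create_default_context()
offered = context.get_ciphers()

print(len(offered) > 0)
for suite in offered[:3]:
    print(suite["name"], "-", suite["protocol"])
```

Each suite name encodes the cipher and key length being agreed upon, which is exactly the "cipher-and-key combination" the text describes.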
The server should encourage the client to use a particular cipher, on account of the fact that more than a few cipher-and-key combinations are in common use in client-server messaging. When a message is both signed and encrypted by the client before being sent over the Internet, it is a 'Secure/Multipurpose Internet Mail Extensions' or S/MIME message; client-to-client encrypted messages are generally sent through the S/MIME system. When a client wants to send a message, it encrypts the message before sending it to the receiver, who can either store the encrypted message or decrypt it immediately in order to read it. The S/MIME system requires no special messaging program: messages can be sent and received by ordinary clients, and end-to-end encryption is provided. (Designing a Secure Messaging Server)
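The reason S/MIME needs no special transport is that an encrypted message is just ordinary MIME mail with standard labels. The sketch below builds such an envelope with Python's `email` library; the payload merely stands in for real PKCS#7 ciphertext (which an actual S/MIME implementation, e.g. OpenSSL, would produce) and the addresses are hypothetical, but the MIME type and parameters are the standard ones an S/MIME enveloped-data message carries.

```python
from email.message import EmailMessage

# Build the MIME envelope an S/MIME-encrypted message travels in.
# The body bytes are a placeholder for real PKCS#7 enveloped data.
msg = EmailMessage()
msg["From"] = "alice@example.com"   # hypothetical sender
msg["To"] = "bob@example.com"       # hypothetical recipient
msg["Subject"] = "Quarterly figures"
msg.set_content(b"...PKCS#7 enveloped data would go here...",
                maintype="application", subtype="pkcs7-mime",
                params={"smime-type": "enveloped-data", "name": "smime.p7m"})

print(msg["Content-Type"])
```

Because the result is plain MIME, any mail server can relay it unchanged; only the receiving client, holding the right private key, can decrypt the payload.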
The reason why data must be encrypted is explained by Lynn Osborne of the DCC. In her letter the author states that it is almost impossible to maintain privacy while transferring data over the Internet, and that no single person can be trusted to keep information confidential and private. It is therefore vitally essential for computer users to learn the art, or science, of encryption and thereby ensure that their privacy is maintained. (Data Security: Human Rights Act: Library Bylaws People's Network (PN) computer systems in Devon libraries) Decryption of captured data can be done, but for technical reasons only; humane reasons dictate that privacy must be respected.
About SSL/TLS. Retrieved at http://www.cs.bham.ac.uk/~mdr/teaching/modules03/security/students/SS8a/SSLTLS.html. Accessed on 1 September, 2004
Analysis of the SSL 3.0 Protocol. Retrieved at http://www.pdos.lcs.mit.edu/6.824-2001/lecnotes/ssl96.txt. Accessed on 2 September, 2004
Beginners Guides: Encryption and Online Privacy. Retrieved at http://www.pcstats.com/articleview.cfm?articleid=252&page=2. Accessed on 1 September, 2004
Boone, Kevin. Secure Servers. Retrieved at http://www.ablestable.com/resources/library/articles/business/business004.html. Accessed on 1 September, 2004
Bravo, Alejandro. Secure Servers with SSL in the World Wide Web. Retrieved from www.giac.org/practical/GSEC/Alex_Bravo_GSEC.pdf. Accessed on 1 September, 2004
CIFS in Depth. 2001. Retrieved at http://www.codefx.com/eCIFS_In_Depth.pdf. Accessed on 2 September, 2004
Client Server / Data access Techniques. Retrieved at http://www.afpfaq.de/mirror/fox.wikis.com/Client-ServerDataAccessTechniques.htm. Accessed on 2 September, 2004
Client. Retrieved at http://www.webopedia.com/TERM/c/client.html. Accessed on 2 September, 2004
Company Overview. Retrieved at http://www.test2day.co.uk/downloads/brochure.pdf. Accessed on 2 September, 2004
Conover, J. SSL VPN: IPSec Killers or Overkill? Retrieved from Current Analysis. http://www2.cio.com/analyst/report1816.html. Accessed on 1 September, 2004
Data Security: Human Rights Act: Library Bylaws People’s Network (PN) computer systems in Devon libraries. Retrieved at http://www.seered.co.uk/intern2.htm. Accessed on 2 September, 2004
Designing a Secure Messaging Server. Retrieved at http://docs.sun.com/source/817-6440/security.html. Accessed on 2 September, 2004
Dierks, T; Allen, C. The TLS Protocol Version 1.0. January 1999. Retrieved at http://www.ietf.org/rfc/rfc2246.txt. Accessed on 3 September, 2004
Enabling technologies Secure Sockets Layer (SSL). Retrieved at http://sellitontheweb.com/ezine/tech20.shtml. Accessed on 1 September, 2004
Farrow, Rik. SSL is not a magic bullet. Retrieved at http://www.spirit.com/Network/net1100.html. Accessed on 1 September, 2004
Freier, Alan O; Karlton, Philip; Kocher, Paul C. “The SSL protocol” Retrieved at http://home.netscape.com/eng/ssl3/ssl-toc.html. Accessed on 1 September, 2004
Haeni, Reto E. IPV6 vs. SSL: Comparing Apples with Oranges. January 1997. Retrieved from www.mabuse.de/sources/ipv6_ssl.pdf. Accessed on 1 September, 2004
Heinrich, Clemens. Secure Socket Layer (SSL) Retrieved from www.win.tue.nl/~henkvt/ClH.SSL.pdf. Accessed on 1 September, 2004
Hetherington, Sally. Internet Security – SSL Explained. Retrieved at http://www.bizland.co.za/articles/technology/sslexplained.htm. Accessed on 2 September, 2004
How does ssl work? Detailed SSL – Step 1 Determine Secure Communication
How does ssl work? Simplified SSL – About Secure Sockets Layer and HTTPS. Retrieved at http://www.ourshop.com/resources/ssl.html. Accessed on 2 September, 2004
Introduction to Secure Sockets Layer. Retrieved at http://www.cisco.com/en/US/netsol/ns340/ns394/ns50/ns140/networking_solutions_white_paper09186a0080136858.shtml. Accessed on 1 September, 2004
Introduction to SSL. Retrieved at http://developer.netscape.com/docs/manuals/security/sslin/. Accessed on 3 September, 2004
More on Remote Views: Rational Decision-making about Remote Views. Retrieved at http://www.afpfaq.de/mirror/fox.wikis.com/MoreOnRemoteViews.htm. Accessed on 2 September, 2004
Netscape. “How SSL Works” Retrieved at http://developer.netscape.com/tech/security/ssl/howitworks.html. Accessed on 3 September, 2004
News & Analysis: Past, Presentation, and Future. Network Magazine. Retrieved from www.networkmagazine.com/shared/printableArticle.jhtml. Accessed on 1 September, 2004
Protecting Confidential Information. Retrieved from https://www.rsasecurity.com/solutionsTertiary.asp?id=1135. Accessed on 2 September, 2004
Remote Views. Retrieved at http://www.afpfaq.de/mirror/fox.wikis.com/RemoteViews.htm. Accessed on 3 September, 2004
SafeEnterprise 2012. Retrieved at http://www.safenet-inc.com/products/igate/netswift2012.asp. Accessed on 2 September, 2004
Secure Socket Layer (SSL). Retrieved from www.dynamic.com/downloads/marketing_documents/StoreSense/ssl.pdf. Accessed on 1 September, 2004
Secure Socket Layer. July 19, 2002. Retrieved at http://www.windowsecurity.com/articles/Secure_Socket_Layer.html. Accessed on 1 September, 2004
Security analysis. Retrieved at http://www.ods.com.ua/win/eng/security/ssl3/appf.phtml. Accessed on 1 September, 2004
Server. Retrieved at http://www.webopedia.com/TERM/s/server.html. Accessed on 2 September, 2004
Shostack, Adam. An Overview of SSL (version 2). May 1995. Retrieved at http://www.homeport.org/~adam/ssl.html. Accessed on 3 September, 2004
Sira, Rommel. Network Forensics Analysis Tools: An Overview of an Emerging Technology. SANS Institute 2003. Retrieved from www.giac.org/practical/GSEC/Rommel_Sira_GSEC.pdf. Accessed on 1 September, 2004
SQL Pass Through. 29 September, 2002. Retrieved at http://www.afpfaq.de/mirror/fox.wikis.com/SQLPass-Through.htm. Accessed on 2 September, 2004
SSL — Supported Methods. Retrieved at http://www.ietf.org/proceedings/95apr/sec/cat.elgamal.slides.html. Accessed on 3 September, 2004
SSL (Secure Sockets Layer). Retrieved at http://www.wedgetail.com/technology/ssl.html. Accessed on 3 September, 2004
SSL Acceleration and Offloading: What Are the Security Implications? Retrieved at http://www.windowsecurity.com/articles/SSL-Acceleration-Offloading-Security-Implications.html. Accessed on 1 September, 2004
SSL. Retrieved at http://www.webopedia.com/TERM/S/SSL.html. Accessed on 3 September, 2004
Stored Procedure. 28 October, 2001. Retrieved at http://www.afpfaq.de/mirror/fox.wikis.com/StoredProcedures.htm. Accessed on 2 September, 2004
The Secure Sockets Layer Protocol – Enabling Secure Web Transactions. 3 February, 2002 Retrieved at http://www.itsecurity.com/papers/rainbow3.htm. Accessed on 1 September, 2004
The Secure Sockets Layer Protocol. Retrieved at http://www.cs.bris.ac.uk/~bradley/publish/SSLP/chapter4.html. Accessed on 1 September, 2004
Thomson, Iain. Microsoft warns of SSL attacks. 26 April 2004. Retrieved at http://www.networkitweek.co.uk/news/1154653. Accessed on 3 September, 2004
Wagner, David; Schneier, Bruce. Analysis of the SSL 3.0 protocol. Retrieved at http://www.schneier.com/paper-ssl.pdf. Accessed on 1 September, 2004
White Paper: Building secure networks using Xpeedium2™ devices. Retrieved from www.switchcore.com/products/whitepapers/security_with_xpeedium2/wp_security_letter.pdf. Accessed on 2 September, 2004
Why CDAP and Not SSL/HTTPS. Retrieved from www.connecterra.com/wp/whynotssl.pdf. Accessed on 3 September, 2004