401 Unauthorized Request workaround when using HttpWebRequest

If you have ever encountered the dreaded “401 Unauthorized” error when trying to get the HTML code from a URL on the same domain (because the application pool identity is different from the logged-in user under Windows authentication), this short snippet will show you how to work around the issue:

System.Net.HttpWebRequest request = (System.Net.HttpWebRequest)System.Net.WebRequest.Create(url);

// Send the credentials of the identity the application pool is running under
// instead of making an anonymous request.
request.Credentials = System.Net.CredentialCache.DefaultCredentials;
request.AuthenticationLevel = System.Net.Security.AuthenticationLevel.MutualAuthRequested;

System.Net.HttpWebResponse response = (System.Net.HttpWebResponse)request.GetResponse();

if (response.StatusCode == System.Net.HttpStatusCode.OK)
{
    System.IO.Stream receiveStream = response.GetResponseStream();
    System.IO.StreamReader readStream = null;

    // Use the character set reported by the server when it is available.
    if (response.CharacterSet == null)
    {
        readStream = new System.IO.StreamReader(receiveStream);
    }
    else
    {
        readStream = new System.IO.StreamReader(receiveStream, System.Text.Encoding.GetEncoding(response.CharacterSet));
    }

    string data = readStream.ReadToEnd();
    readStream.Close();
}

response.Close();

How do I install Active Directory management tools on Windows 7?

The Remote Server Administration Tools (RSAT) for Windows 7 can be downloaded from Microsoft’s web site:
http://www.microsoft.com/downloads/details.aspx?displaylang=en&FamilyID=…

Remote Server Administration Tools for Windows 7

You can install the Administration Tools pack on computers that are running the Windows 7 operating system, and use the Administration Tools pack to manage specific technologies on computers that are running either Windows Server® 2008 R2, Windows Server 2008, or, in some cases, Windows Server 2003.

The Administration Tools pack includes support for remote management of computers that are running the Server Core installation option of either Windows Server 2008 R2 or Windows Server 2008. However, Remote Server Administration Tools for Windows 7 cannot be installed on any versions of the Windows Server operating system.

Administration Tools are secure by default. The default Administration Tools configuration opens only those ports and enables only those services and firewall exceptions required for remote management to work.


How to upload all files from a folder to Microsoft Azure Storage

If you are working with the Microsoft Cloud, you have probably needed to move a few files in bulk from a folder to the Azure Storage System.
To get started, you will need a Microsoft Azure account, a storage account, and the connection string for it.
If you are unsure about how to proceed, check out this great introductory article: How to use Blob Storage from .NET
This is a demo connection string that you will need to add to your web.config file:
<add key="StorageConnectionString" value="DefaultEndpointsProtocol=https;AccountName=srmd1983;AccountKey=JyUOu3/iv+0UMjzI/PtoHd2JKhKx4SOSSxJcsvVp95isAZH6hKpPs/AQDOPxgVXjTNGWCYCSssgiwVVun0rlWXFgJ6A==" />

In your code, import the needed Microsoft.WindowsAzure.Storage namespaces. To make them available, install the Azure SDK package for your version of Visual Studio and add a reference to the DLL.

Imports Microsoft.WindowsAzure.Storage
Imports Microsoft.WindowsAzure.Storage.Auth
Imports Microsoft.WindowsAzure.Storage.Blob

Then you will need to create a method that will do all the dirty work. The files will be uploaded into a subdirectory per client (in this case, the subdirectory name is generated from the ClientID value).

Imports Microsoft.WindowsAzure.Storage
Imports Microsoft.WindowsAzure.Storage.Auth
Imports Microsoft.WindowsAzure.Storage.Blob
Imports System.Configuration
Imports System.IO

Public Class Azure

    Shared Function UploadAllFilesToBlob(ClientID As String) As String
        Dim err As String = ""
        Try
            ' Parse the connection string stored in web.config.
            Dim storageAccount As CloudStorageAccount = CloudStorageAccount.Parse( _
                ConfigurationManager.AppSettings("StorageConnectionString"))
            Dim blobClient As CloudBlobClient = storageAccount.CreateCloudBlobClient()

            ' Retrieve a reference to a container.
            Dim container As CloudBlobContainer = blobClient.GetContainerReference("uploads")
            ' Create the container if it doesn't already exist.
            container.CreateIfNotExists()

            ' Each client gets its own virtual subdirectory inside the container.
            Dim dir As CloudBlobDirectory = container.GetDirectoryReference(ClientID)

            ' Upload every file found in the local folder configured in web.config.
            Dim path As String = ConfigurationManager.AppSettings("SavePath")
            Dim di As New DirectoryInfo(path)
            For Each fi As FileInfo In di.GetFiles()
                Dim fileStream = File.OpenRead(fi.FullName)
                ' Retrieve a reference to a blob named the same as the local file.
                ' If the blob already exists, it will be overwritten.
                Dim blockBlob As CloudBlockBlob = dir.GetBlockBlobReference(fi.Name)
                fileStream.Position = 0
                blockBlob.UploadFromStream(fileStream)
                fileStream.Close()
                ' Remove the local copy once the upload succeeds.
                fi.Delete()
            Next
        Catch ex As Exception
            err = "Error: " + ex.Message + "<br />" + ex.StackTrace
        End Try
        Return err
    End Function

End Class

How to search a website with Microsoft Indexing services

Providing search capabilities requires two steps:

  • First, you must create an index of the site’s contents. An index of a website is analogous to the index of a book: if you want to read about a particular topic in the book, you can quickly find the relevant page(s) through the index, as opposed to having to read through the book’s entire contents.
  • Once an index has been created you need to be able to search through the index. Essentially, a user will enter a search string and you’ll need to find the matching, relevant results in the index, displaying them to the user.

Unfortunately, building a search engine for your site is not as straightforward and simple as we’d like. Writing your own indexer, indexing the content, and building code to search the index is definitely possible, but requires a good deal of work. Fortunately, there are a number of indexers out there that you can leverage for your site. Some are commercial products, like EasySearchASP.NET and Building31.Search, which are designed specifically to search an ASP.NET website. Additionally, Microsoft provides its own indexer, Microsoft Indexing Services.

This article examines using Microsoft Indexing Services for your site’s search functionality.
With Indexing Services you can specify a group of documents or HTML pages to be indexed, and then create an ASP.NET page that can query this index.

We’ll build a simple, fast, and extensible search tool using .NET and Microsoft Indexing Services along with the Adobe IFilter plug-in, which allows MS Indexing Services to index PDF documents and display them in your search results.

Configuring Microsoft Indexing Services

The first step in creating an index for your search application is to configure Indexing Services on the IIS server that your Web application will be running on. To do this you need access to the Web server itself. Open the Microsoft Management Console by clicking Start, then Run; type mmc and click OK. Next, to open the Indexing Service snap-in, you must:

  • Click File,
  • Click Add/Remove Snap-in,
  • Click Add,
  • Select the Indexing Service snap-in,
  • Click Add,
  • Click Finish,
  • Close the dialog

After following these steps you should see something akin to the screenshot below.

To create a new catalog – which is the vernacular Microsoft uses for an index – right-click on the Indexing Service node, click New and then Catalog. You then need to choose a location to store the catalog file. Once you’ve done that, expand the catalog that you just created and click on the Directories icon. Right-click on the Directories folder, click New, then Directory, and add the directory or directories that contain the content that you want to search. These directories can reside anywhere that the host computer can access; virtual directories and even UNC paths (\\server\share) may be used. However, each directory that is indexed must either reside physically, or be included as a virtual directory, in the root of the website that you are indexing. If a directory is specified that is not in the web root via a physical folder or virtual directory, the results will be displayed in your search, but they will return broken links.


Indexing Services will effectively index HTML, Word, and, once properly configured, PDF documents. To ensure that your required directories will be indexed, you should verify that the index flag is properly set on the files and folders. You can verify this setting by right-clicking on any folder or file and selecting Properties. Click the “Advanced” button and make sure that the “For fast searching, allow Indexing Service to index this folder” checkbox is checked, as shown in the screenshot to the right.

Next, you want to set the properties of this catalog so that the HTML paths can be used, and so that Indexing Services will generate abstracts for the documents as they are indexed. To do this, right-click on the catalog you just created and select Properties. On the Tracking tab, you’ll need to make sure that the “WWW Server:” field is set to the website that your application will be running from. This ensures that the HTML paths work as they should when you get to building the front-end for the search. If you want to display a short bit of each article along with your search results, go to the Generation tab, uncheck “Inherit above settings from Service,” then check “Generate abstracts” and set the number of characters you wish to have displayed in each abstract.

If you want your search to include PDF documents, then you must install the Adobe IFilter extension. You can download this free of charge from Adobe: http://www.adobe.com/support/downloads/detail.jsp?ftpID=2611.
This plug-in is a standard Windows installer and requires no additional configuration. After the plug-in has been installed, PDF documents will automatically be included in the search results as they are indexed, without any user intervention or configuration required.

When you navigate to the Directories folder in the catalog that you’ve created, you may notice that one or more directories appear in addition to the ones you added in the previous step. These are website shares added automatically by Indexing Services, and they need to be excluded from indexing if you don’t want your search to include them. To exclude these directories, you must find them in the file system via Windows Explorer. Next, right-click the folder and choose Properties. From the dialog that appears, click Advanced and uncheck the box that says “For fast searching, allow Indexing Service to index this folder.” (See the screenshot above.) This will exclude the folder from your search results. The configuration of Indexing Services is now complete.

As you can see, an index may include as little as one folder of documents or as much as an entire website or group of websites. It’s up to you to determine the breadth of the index. However, since Index Services does not crawl links like a spider, it will only catalog file system objects. Thus, the results from this search will include static files such as HTML pages, Word documents, and PDF documents, but not any dynamically generated pages. Changes made to these static documents will be picked up by Indexing Services and will very quickly be reflected in your search results.

Searching the Index

Once the index has been created, the next step is to build a search page that allows the website visitor to search through the index. To do this, you need, at minimum, a TextBox Web control for the end user to enter search terms, a Button Web control to initiate the search, and a Repeater control to display the results. The following shows the bare minimum markup for a simple search page:

Enter your search query:
<asp:TextBox id="txtSearch" runat="server"/>

<asp:Button
      id="btnSearch"
      runat="server"
      Text="Search"
      OnCommand="btnSearch_Click"
      CommandName="search"
/>

<hr>

<asp:Repeater id="searchResults" runat="server">
    <HeaderTemplate></HeaderTemplate>

    <ItemTemplate>
        <%# DataBinder.Eval(Container.DataItem, "File") %> <br>
        <%# DataBinder.Eval(Container.DataItem, "FileAbstract") %> <br>

    </ItemTemplate>

    <SeparatorTemplate><br></SeparatorTemplate>
</asp:Repeater>


This results page will display a list of results with a line for the document title followed by the abstract, which is generated by Indexing Services. Let’s take a look at the code-behind class.

In the code-behind page, an OleDbConnection is attached to the Indexing Services catalog that we set up earlier. Once connected, the catalog can be searched using a variety of query languages, including SQL syntax. You can read about each of the language options here: Query Languages for Indexing Services. For this example, I’m going to use the IS Query Language to perform a freetext search, which allows for natural language search capabilities, but you can modify your search to use Boolean, phrase, or any of the query types that Indexing Services supports.

To set up the connection to the Indexing Services catalog, you need to set up an OleDb connection as follows:


// Create a connection object and command object to connect to the Indexing Services catalog.
System.Data.OleDb.OleDbConnection odbSearch = new System.Data.OleDb.OleDbConnection("Provider=MSIDXS;Data Source=docSearch;");

System.Data.OleDb.OleDbCommand cmdSearch = new System.Data.OleDb.OleDbCommand();

// Assign the connection to the command object cmdSearch.
cmdSearch.Connection = odbSearch;

// Escape single quotes in the user's input by doubling them.
string searchText = txtSearch.Text.Replace("'", "''");

// Execute the query using freetext searching and sort by relevance ranking.
// The query searches for the free-text string in the contents of the indexed documents in the catalog.
cmdSearch.CommandText = "select doctitle, filename, vpath, rank, characterization from scope() where FREETEXT(Contents, '" + searchText + "') order by rank desc";
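
The snippet above only builds the command; it never actually runs the query. As a minimal sketch (assuming the docSearch catalog and the cmdSearch command shown above, with illustrative variable names), you can execute it through an OleDbDataAdapter and load the rows into a DataTable that will later be bound to the Repeater:

// Minimal sketch: run the query built above and load the results into a DataTable.
// Fill() opens and closes the underlying connection automatically.
System.Data.DataTable results = new System.Data.DataTable();

using (System.Data.OleDb.OleDbDataAdapter adapter = new System.Data.OleDb.OleDbDataAdapter(cmdSearch))
{
    adapter.Fill(results);
}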

The fields returned from querying the index include:

  • Doctitle: The title of the document, which is the text between the <title> tags in an HTML document or the text in the title field of a Word or PDF document.
  • Filename: The physical name of the file that the result was returned from.
  • Vpath: The virtual path of the file where the result was returned from. This is the field you use to specify an HTML link to the file.
  • Rank: The relevance of the returned result.
  • Characterization: The abstract for the document, usually the first 320 characters.

Tying it All Together

While there are a number of ways in which you can display the results from your search, a Repeater is likely the most efficient and gives you the greatest control over how your results are formatted. The sample application that is attached demonstrates how to bind the results of your search to a Repeater control. It also adds paging functionality that makes the results easier to use, as shown in the screenshot below. The results can easily be modified to show paths to documents or to display the ranking of each result.
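
As a rough sketch of that binding step (assuming the DataTable filled in the earlier snippet, a hypothetical pageIndex variable holding the current page number, and an assumed mapping of the query columns to the File and FileAbstract fields that the Repeater markup binds to), the code-behind could look like this:

// Rename the catalog columns to the names the Repeater markup evaluates.
// Which columns map to "File" and "FileAbstract" is an assumption here.
results.Columns["doctitle"].ColumnName = "File";
results.Columns["characterization"].ColumnName = "FileAbstract";

// Wrap the results in a PagedDataSource to get simple paging on top of the Repeater.
System.Web.UI.WebControls.PagedDataSource pagedResults = new System.Web.UI.WebControls.PagedDataSource();
pagedResults.DataSource = results.DefaultView;
pagedResults.AllowPaging = true;
pagedResults.PageSize = 10;                // results per page
pagedResults.CurrentPageIndex = pageIndex; // hypothetical current page, zero-based

searchResults.DataSource = pagedResults;
searchResults.DataBind();

Previous/next buttons can then adjust pageIndex (stored in ViewState, for example) and call the binding code again.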

Conclusion

This search tool is small, fast, and simple enough to deploy for searching individual folders of specific content on your intranet or Internet site. However, it can easily be used to search entire sites composed of thousands of documents. Thanks to the power of Microsoft Indexing Services, all that you will need to do is alter the scope of the Indexing Services catalog to include any folders you want indexed. New documents added to these folders, and modifications to existing documents, will automatically be picked up by Indexing Services.


Windows Azure Security Whitepaper Released

After a lot of waiting, the Windows Azure Security Whitepaper has been released.

To download the latest release, check out the link below:

Windows Azure Network Security Whitepaper – FINAL

If you would like to keep your server/application secure, make sure you follow these guidelines and these Security Best Practices for Windows Azure Apps.

 

Windows Azure Security Layers

IIS and Windows 8 – How to Install

One of the first things Web developers using ASP.NET will want to install on Windows 8 is IIS (Internet Information Services). Windows 8 ships with a new version of IIS, version 8, so let’s take a look at installing it.

Installing IIS

Keeping with Microsoft’s modular design of, uhm, everything these days, IIS in Windows is still an optional “Windows Feature”. To install it, press the Windows + R key combination to bring up a Run box, then type appwiz.cpl and press Enter.

This will open the Programs and Features part of Control Panel; on the left-hand side, click on the “Turn Windows features on or off” link.

Now click on the Internet Information Services check box.

If you’re a developer, you are going to want to expand it and explore the sub-components as well. By default it installs all the components needed to host a website, but you are probably going to need some of the more developer-centric components as well.

After clicking OK, this dialog will appear on your screen for a while.

When it’s done, fire up your browser and navigate to localhost.

That’s all there is to it.

Computer Security Ethics and Privacy

Today, many people rely on computers to do homework, work, and create or store useful information. Therefore, it is important for the information on the computer to be stored and kept properly. It is also extremely important for computer users to protect their computers from data loss, misuse, and abuse.

For example, it is crucial for businesses to keep the information they have secure so that hackers can’t access it. Home users also need to take measures to make sure that their credit card numbers are secure when they are participating in online transactions.

A computer security risk is any action that could cause loss of information, software, or data, processing incompatibilities, or damage to computer hardware, and many of these actions are planned to do damage. An intentional breach of computer security is known as a computer crime, which is slightly different from a cybercrime.

A cybercrime is an illegal act committed over the internet, and cybercrime is one of the FBI’s top priorities. There are several distinct categories of people that commit cybercrimes, and they are referred to as hackers, crackers, cyberterrorists, cyberextortionists, unethical employees, script kiddies, and corporate spies. The term hacker was originally a good word, but now it has a very negative connotation. A hacker is defined as someone who accesses a computer or computer network unlawfully. They often claim that they do this to find leaks in the security of a network. The term cracker has never been associated with anything positive; it refers to someone who intentionally accesses a computer or computer network for malicious reasons. It’s basically an evil hacker. They access it with the intent of destroying or stealing information. Both crackers and hackers have very advanced network skills.

A cyberterrorist is someone who uses a computer network or the internet to destroy computers for political reasons. It’s just like a regular terrorist attack because it requires highly skilled individuals, millions of dollars to implement, and years of planning. The term cyberextortionist refers to someone who uses email as an offensive force. They will usually send a company a very threatening email stating that they will release some confidential information, exploit a security leak, or launch an attack that will harm the company’s network. They then request payment to not proceed, sort of like blackmail in a sense.

An unethical employee is an employee that illegally accesses their company’s network for numerous reasons. One could be the money they can get from selling top-secret information; others may be bitter and want revenge.

A script kiddie is like a cracker in that they may have the intention of doing harm, but they usually lack the technical skills. They are usually silly teenagers that use prewritten hacking and cracking programs.

A corporate spy has extremely high computer and network skills and is hired to break into a specific computer or computer network to steal or delete data and information. Shady companies hire these types of people in a practice known as corporate espionage; they do this to gain an advantage over their competition, which is an illegal practice. Business and home users must do their best to protect or safeguard their computers from security risks.

The next part of this article will give some pointers to help protect your computer. However, one must remember that there is no one-hundred-percent-guaranteed way to protect your computer, so becoming more knowledgeable about these risks is a must these days. Information transferred over the internet carries a higher security risk than information transmitted within a business network, because administrators usually take some extreme measures to help protect against security risks, while over the internet there is no powerful administrator, which makes the risk a lot higher. If you’re not sure whether your computer is vulnerable to a computer risk, you can always use some type of online security service, which is a website that checks your computer for email and internet vulnerabilities. The company will then give some pointers on how to correct these vulnerabilities. The Computer Emergency Response Team Coordination Center is a place that can do this.

The typical network attacks that put computers at risk include viruses, worms, spoofing, Trojan horses, and denial-of-service attacks. Every unprotected computer is vulnerable to a computer virus, which is a potentially harmful computer program that infects a computer and alters the way the computer operates without the user’s consent. Once the virus is in the computer, it can spread, infecting other files and potentially damaging the operating system itself. It’s similar to a virus that infects humans, because it gets into the body through small openings, can spread to other parts of the body, and can cause some damage. The similarity is that the best way to avoid either is preparation.

A computer worm is a program that repeatedly copies itself and is very similar to a computer virus. However, the difference is that a virus needs to attach itself to an executable file and become a part of it. A computer worm doesn’t need to do that; it simply copies itself to other computers and networks and eats up a lot of bandwidth.

A Trojan horse, named after the famous Greek myth, is a program that secretly hides and looks like a legitimate program but is a fake. A certain action usually triggers the Trojan horse, and unlike viruses and worms, it doesn’t replicate itself. Computer viruses, worms, and Trojan horses are all classified as malicious-logic programs, which are simply programs that deliberately harm a computer. Although these are the common three, there are many more variations, and it would be almost impossible to list them all. You know a computer is infected by a virus, worm, or Trojan horse if one or more of these symptoms appear:

  • Weird messages or pictures appear on the screen.
  • You have less available memory than you expected.
  • Music or sounds play randomly.
  • Files get corrupted.
  • Programs or files don’t work properly.
  • Unknown files or programs randomly appear.
  • System properties fluctuate.

Computer viruses, worms, and Trojan horses deliver their payload or instructions through four common ways. First, when an individual opens an infected file, so if you download a lot of things you should always scan the files before opening them, especially executable files. Second, when an individual runs an infected program. Third, when an individual boots a computer with an infected drive, which is why it’s important not to leave removable media in your computer when you shut it down. Fourth, when an unprotected computer is connected to a network. Today, a very common way that people get a computer virus, worm, or Trojan horse is when they open an infected file attached to an email. There are literally thousands of malicious-logic programs, and new ones come out in large numbers, so it’s important to keep up to date with the new ones that appear each day. Many websites keep track of this. There is no known method for completely protecting a computer or computer network from computer viruses, worms, and Trojan horses, but people can take several precautions to significantly reduce their chances of being infected by one of those malicious programs.

Whenever you start a computer, you should have no removable media in the drives. This goes for CDs, DVDs, and floppy disks. When the computer starts up, it tries to execute the boot sector on the drives, and even if that attempt is unsuccessful, any virus on the boot sector can infect the computer’s hard disk. If you must start the computer from removable media for a particular reason, such as when the hard disk fails and you are trying to reformat the drive, make sure that the disk is not infected.