Showing posts with label Proxy. Show all posts

Mar 5, 2010

WeatherBug Proxy Pass-Through

So I managed to get the proxy pass-through working and returning data for iteration 2 of the individual project. Since it took me a while to figure out, I thought I would pass on my experience in case others are running into the same difficulties. I used the WeatherBug API; the documentation can be found on the WeatherBug site.

The first thing I did was to use the createRequest method that was provided to us in class.
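The slide code itself isn't reproduced here, so here is a sketch of the classic cross-browser XMLHttpRequest factory that createRequest functions like this are usually built on (a reconstruction of the common pattern, not the exact code from the class slides):

```javascript
// Sketch of a typical createRequest helper (reconstruction of the common
// cross-browser pattern, not the exact code from the class slides).
function createRequest() {
  var request = null;
  try {
    request = new XMLHttpRequest(); // modern browsers
  } catch (e1) {
    try {
      request = new ActiveXObject("Msxml2.XMLHTTP"); // older IE
    } catch (e2) {
      try {
        request = new ActiveXObject("Microsoft.XMLHTTP"); // even older IE
      } catch (e3) {
        request = null; // no AJAX support at all
      }
    }
  }
  return request;
}
```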



Next I had to create the getWeather function to call the proxy. I handed it a zip code as a parameter so it could find the weather for a location.
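Since the screenshot is missing, here is roughly what a getWeather function like that looks like (the proxy filename and the zip parameter name are assumptions; createRequest is the helper provided in class):

```javascript
// Hypothetical getWeather sketch — "proxy.php" and the "zip" parameter
// name are assumptions; createRequest is the helper provided in class.
function buildWeatherUrl(zip) {
  return "proxy.php?zip=" + encodeURIComponent(zip);
}

function getWeather(zip) {
  var request = createRequest();
  if (request === null) return;
  request.onreadystatechange = function () {
    if (request.readyState == 4 && request.status == 200) {
      displayWeather(request); // hand the response to the display function
    }
  };
  request.open("GET", buildWeatherUrl(zip), true);
  request.send(null);
}
```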




Next I created the proxy script. I did this a bit differently than Professor Drake showed us, but I was able to find quite a bit of documentation and sample code out there to choose from. This one first takes the zip code that I passed to it and puts it in the $zip variable. It then puts the call URL into the $url variable. Then I append the zip to the end of the $url variable by concatenating the strings: $fullUrl = $url.$zip. The rest is the call to the API, which then puts the data into the $data variable.
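In place of the missing screenshot, here is a sketch of what that proxy script boils down to (the base URL below is a placeholder, not WeatherBug's real endpoint — substitute the call URL and API key from the documentation):

```php
<?php
// Sketch of the proxy script — the base URL below is a placeholder,
// not WeatherBug's real endpoint.
$zip = $_REQUEST['zip'];                       // zip code passed from JavaScript
$url = "http://example.com/weather?zipCode=";  // call URL (placeholder)
$fullUrl = $url . $zip;                        // append the zip to the URL

$session = curl_init($fullUrl);                // start the cURL session
curl_setopt($session, CURLOPT_HEADER, false);  // don't include HTTP headers
curl_setopt($session, CURLOPT_RETURNTRANSFER, true); // return, don't print
$data = curl_exec($session);                   // call the API
curl_close($session);

header("Content-Type: text/xml");              // hand the XML back as XML
echo $data;                                    // this is what displayWeather() receives
?>
```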




I then just echo the $data variable, which sends that data to the displayWeather() function in my JavaScript.




This function takes the returned data and displays it in my web page inside the div id=weather tag. While it took me a while to figure out how to get this to work, it was pretty easy to recreate for my other APIs. I literally just needed to change all of the global variables and call a different URL. I was able to use this same code with three other APIs for the group project.
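For completeness, a displayWeather along these lines (the request object is the one the createRequest helper returned, and "weather" matches the div id mentioned above):

```javascript
// Sketch of displayWeather — writes the proxy's response into the
// div with id="weather" mentioned above.
function displayWeather(request) {
  if (request.readyState == 4 && request.status == 200) {
    document.getElementById("weather").innerHTML = request.responseText;
  }
}
```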

Feb 18, 2010

Proxy pass-through

I decided to write a blog about the proxy pass-through because I know that it can be a little difficult to understand the first time you look at it. I think the most important thing to know is that it really isn't too complicated. Based on Dr. Drake's PowerPoint, I will explain some steps to follow.

To get started, create a basic HTML web page. Remember to include the script tag in the head; this is shown on slide 5 of Dr. Drake's PowerPoint. Just remember that for iteration 2 we will need an event handler, like a button onClick, and a div for display.
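A minimal page along the lines of that slide might look like this (the function name and div id are just examples):

```html
<html>
  <head>
    <title>Iteration 2</title>
    <script type="text/javascript" src="ajax.js"></script>
  </head>
  <body>
    <!-- event handler for iteration 2 -->
    <input type="button" value="Get Details" onclick="getDetails()" />
    <!-- div for display -->
    <div id="details"></div>
  </body>
</html>
```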


Next, create a JavaScript file named ajax.js. All you have to do is copy slides 6, 7, and 8 of Dr. Drake's PowerPoint. Note that for iteration 2 we won't use window.onload, because we will have an event handler. You will also have to change the displayDetails function, because we can't use an alert box. The getDetails function is where we call the proxy file.
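In outline (the names follow the slides' getDetails/displayDetails; the proxy filename and div id are assumptions), the event-handler version looks something like:

```javascript
// Sketch of ajax.js for iteration 2 — no window.onload, because the
// button's onclick event handler starts the request instead.
// createRequest is the cross-browser helper from slide 6.
function getDetails() {
  var request = createRequest();
  if (request === null) return;
  request.onreadystatechange = function () {
    if (request.readyState == 4 && request.status == 200) {
      displayDetails(request.responseText);
    }
  };
  request.open("GET", "proxy.php", true); // this is the call to the proxy file
  request.send(null);
}

// No alert box for iteration 2 — write into the display div instead.
function displayDetails(text) {
  document.getElementById("details").innerHTML = text;
}
```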


Then all you need to do is create one more file, name it proxy, and save it as a PHP document. Just copy slides 9-13 of Dr. Drake's PowerPoint. The highlighted area is where we define a hostname. For iteration 2 we will need to insert our own hostname from the web service we have chosen. Some APIs will require a key and some parameters along with the hostname.
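The hostname part amounts to something like this (the hostname, key, and parameter names here are placeholders for whatever your chosen web service requires):

```php
<?php
// Placeholder hostname — replace with your own web service's URL,
// plus whatever key and parameters it requires.
$hostname = "http://api.example.com/service?key=YOUR_KEY&param=value";

$session = curl_init($hostname);                     // request the service
curl_setopt($session, CURLOPT_RETURNTRANSFER, true); // capture the response
$xml = curl_exec($session);
curl_close($session);

header("Content-Type: text/xml");                    // return it as XML
echo $xml;
?>
```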



This is only one way to call a web service using a proxy pass-through and return the information as XML, which is the requirement for iteration 2. As we continue through the semester we will take this XML and parse it to display only the tags we need. I hope this is helpful and makes iteration 2 a little easier to complete.

Nov 22, 2009

Sending a text string with %20 to your Proxy Pass-Through

I thought I had everything working after my last post about “Building your Proxy Pass-Through URL”. As it turns out, everything works fine until you have, in my case, a space in the city name. I remembered seeing a post about that problem, so I logged in and found Jassin's post about parsing errors. That gave me a clue about what I needed to do to make my stuff work, but I really didn’t want to change my PHP, and it looked like a lot of work to replace spaces with %20 in PHP. This is what I did instead:

I used a switch statement because I needed to map our Trail Map City to the Active.com City and define an event_City variable depending on which city is selected.

switch (event_City)
{
    case 'tallahassee':
        event_City = "Tallahassee" + "%20" + "-" + "%20" + "Thomasville";
        no_event_City = "Tallahassee - Thomasville";
        break;
}


You can see that I took Jassin's advice and used it to build my city strings with %20 instead of spaces. However, after doing that I still couldn’t get it to work; it seemed to give me the same results as passing in my city string with spaces.

After playing with that for a while I decided to try something Josh used with the Yahoo Weather API. This function, encodeURIComponent(), encodes special characters as well as these characters: , / ? : @ & = + $ #.

I really wasn’t sure if this would work because it seemed that I would need to decode my string on the server end. Well, you can believe that I was not only surprised but extremely happy when I used encodeURIComponent(event_City) and it worked!!! I didn’t have to change anything with my PHP proxy pass-through.


This is my original url: var url = 'active.php?path=' + event_City;


And, this is my new url: var url = 'active.php?path=' + encodeURIComponent(event_City);

So all I really needed to do to get this to work was to build the string with %20 instead of spaces and send it to my PHP with encodeURIComponent(event_City).
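Putting the two pieces together: as best I can tell, the reason this works is that encodeURIComponent escapes the % signs themselves, so when PHP automatically decodes the query string once, the proxy is handed the literal %20 sequences it needs for the remote URL.

```javascript
var event_City = "Tallahassee" + "%20" + "-" + "%20" + "Thomasville";

// encodeURIComponent turns each "%" into "%25", so the string survives
// the server's one round of automatic query-string decoding.
var encoded = encodeURIComponent(event_City);
// encoded is "Tallahassee%2520-%2520Thomasville"

var url = 'active.php?path=' + encoded;
```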

Nov 15, 2009

Group Iteration 2

Well, we made it through iteration 2 and are now working to complete iteration 3. For iteration 2, I used buttons with onClick event handlers to make requests through the proxy pass-through to retrieve web services from the Wine.com API and the Blog API. I also incorporated the Google Map into our web page.


Right now I am still working on taking this raw XML data and trying to parse it into useful information. I am having a little difficulty figuring out some of this. We also need to add one more API to our page.

Many in the class had great suggestions for adding another API. I am completely open to all suggestions and advice. Hopefully this will all come together over the next four weeks and the project will be ready for iteration 3.

Nov 14, 2009

PHP Proxy Pass-Through using several keywords

While working on our group project, we had some problems with our PHP proxy when we handed over keywords separated by spaces. The result of the cURL request was an error saying that Google Translate was not able to process the request.

To solve this problem, we tried escape(variable), and also manually replacing the spaces with "%20" in the script. After some research we found out that the proxy decodes the "%20" back into a space. So we had to find a solution within the proxy file, and here it is:

$rk = $_REQUEST['rk'];
$token = strtok($rk, " "); // splits $rk into parts, cutting wherever
                           // a space is used as the delimiter
$test  = "";    // the new string that will be built up
$test1 = "%20"; // the new delimiter
while ($token !== false) // iterate over the parts
{
    $test  = $test . $token . $test1; // append the part and delimiter to $test
    $token = strtok(" ");             // assign the next part of $rk to $token
}
$rk = $test;



We use the PHP function strtok(string, delimiter) to initialize the splitting process; each further call to strtok(delimiter) returns the next part of the string. This was the only way we found that allowed us to replace the spaces with their encoded equivalent.
I guess some of you could use this.
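As a footnote, PHP's built-in str_replace does the same substitution in one call, a swapped-in alternative rather than what we actually shipped (note it does not add the trailing %20 that the strtok loop appends):

```php
<?php
// One-line alternative to the strtok loop above.
$rk = str_replace(" ", "%20", $_REQUEST['rk']);
?>
```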

Oct 29, 2009

Charles Proxy





I want to make everyone aware of another great tool for your coding toolbox, Charles Web Debugging Proxy. Charles Proxy is somewhat similar to Firebug, which was mentioned in a previous post by Colin. As stated on the Charles Proxy website http://www.charlesproxy.com/, "Charles is an HTTP proxy / HTTP monitor / Reverse Proxy that enables a developer to view all of the HTTP and SSL / HTTPS traffic between their machine and the Internet. This includes requests, responses and the HTTP headers (which contain the cookies and caching information)."


Some features of Charles Proxy:


Browser Agnostic
For all you die-hard browser fanatics out there (insert Matt Mager's name here), Charles Proxy works out of the box with IE, Firefox (you need to download an extension), and Opera.

Operating System Agnostic
Runs on Windows, Mac OS, and Linux.

SSL Proxying - view SSL requests and responses in plain text.

Bandwidth Throttling - simulates slower internet connections, including latency.
This is a great feature when you want to test your web development with the simulated bandwidth of one of your target users.

AJAX Debugging - supports XML and JSON natively, so you can view XML and JSON requests and responses as a tree or as text. This feature would have come in handy when I was having issues retrieving JSON data for the second iteration of my personal project.

Breakpoints to intercept and edit requests or responses -
You are able to insert a breakpoint and edit requests to test different inputs.

Validate recorded HTML, CSS, and RSS/Atom responses using the W3C validator.

Charles Proxy comes with a 30-day trial; at the end of the trial you will have to purchase a license for $50.00. From a business standpoint, the $400 site license or $700 multi-site license seems the best way to go.

Just to reiterate: Charles Proxy supports AJAX debugging, so you can view XML and JSON requests and responses as a tree or as text.
Enjoy!!!

Oct 27, 2009

Our group-project -> Buy-lingu.al

After posting the main secrets of our group work, I will now post something about the project and our solved and unsolved problems.
We started by working in parallel. One part of the group designed the website layout; the others researched how to get the APIs running.
The first idea was to use eBay’s and Google’s JavaScript files, which display the results in their own div tags. We implemented them and had them running. At iteration 1 we presented a website that translated a keyword and, after confirming an alert box, searched eBay and displayed the results on our website.
The next step was to understand the eBay JavaScript files so we could hook our retranslation into them. We found a way to get the titles of the eBay items, hand them to the retranslation part of our JavaScript file, and integrate the translated titles back into the result page.
We thought we were almost done, with only some bugs left to fix:
- Removing the alert box that shows the translation result caused the scripts to run in parallel: eBay was searched before we had a result from Google Translate.
- Removing the alert box after the retranslation caused the results to be displayed without titles.
These bugs showed us the main problem: the different functions in the script do not wait for each other. We tried to find the solution in the scripts, but we were not able to identify the cause.
When we got to the class where we talked about proxies, we researched the eBay and Google developer sites again and found the solutions that have already been posted here.
We threw our almost-working code away and started again using the proxies.

Our code showed us that, in JavaScript, a sequence of asynchronous functions does not run one after the other but in parallel. We realized that the second function has to be started at the end of the first function.
Maybe somebody knows another way to ensure that function b starts only after function a has ended?
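For anyone reading along, the usual pattern is exactly what we ended up doing: pass function b to function a as a callback, and have a invoke it only once its result is ready. A toy sketch (all names here are made up; in real code the callback would fire inside onreadystatechange once the asynchronous response arrives):

```javascript
var log = []; // records the order the steps actually ran in

// Stand-in for the Google Translate call — in real code the callback
// would be invoked from inside onreadystatechange, once the
// asynchronous response has arrived.
function translateKeyword(keyword, callback) {
  log.push("translate:" + keyword);
  var translated = keyword.toUpperCase(); // placeholder "translation"
  callback(translated);
}

// Stand-in for the eBay search — runs only when it is handed a result.
function searchEbay(translated) {
  log.push("search:" + translated);
}

// b (searchEbay) starts only after a (translateKeyword) has its result.
translateKeyword("wein", searchEbay);
```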

Another problem we still have is getting the Amazon API running. If somebody has experience with it, please let us know!

Oct 21, 2009

Proxy

Well, I was wondering what to write about and figured I could mention something about proxy servers. I am sure most of you know about them already, but I would still like to put out some information.

In computer networks, a proxy server is a server (a computer system or an application program) that acts as an intermediary for requests from clients seeking resources from other servers. A client connects to the proxy server, requesting some service, such as a file, connection, web page, or other resource, available from a different server. The proxy server evaluates the request according to its filtering rules. For example, it may filter traffic by IP address or protocol. If the request is validated by the filter, the proxy provides the resource by connecting to the relevant server and requesting the service on behalf of the client. A proxy server may optionally alter the client's request or the server's response, and sometimes it may serve the request without contacting the specified server. In this case, it 'caches' responses from the remote server, and returns subsequent requests for the same content directly.

A proxy server has many potential purposes, including:

  • To keep machines behind it anonymous (mainly for security).
  • To speed up access to resources (using caching). Web proxies are commonly used to cache web pages from a web server.
  • To apply access policy to network services or content, e.g. to block undesired sites.
  • To log / audit usage, i.e. to provide company employee Internet usage reporting.
  • To scan transmitted content before delivery for malware.
  • To scan outbound content, e.g. for data leak protection.

A proxy server that passes requests and replies unmodified is usually called a gateway or sometimes tunneling proxy.

A proxy server can be placed in the user's local computer or at various points between the user and the destination servers on the Internet.

A reverse proxy is a (usually) Internet-facing proxy used as a front-end to control and protect access to a server on a private network, commonly also performing tasks such as load-balancing, authentication, decryption or caching.

Web proxy

A proxy that focuses on World Wide Web traffic is called a "web proxy". The most common use of a web proxy is to serve as a web cache. Most proxy programs (e.g. Squid) provide a means to deny access to certain URLs in a blacklist, thus providing content filtering. This is often used in a corporate, educational or library environment, and anywhere else where content filtering is desired. Some web proxies reformat web pages for a specific purpose or audience (e.g., cell phones and PDAs).

AOL dialup customers used to have their requests routed through an extensible proxy that 'thinned' or reduced the detail in JPEG pictures. This sped up performance but caused problems, either when more resolution was needed or when the thinning program produced incorrect results. This is why in the early days of the web many web pages would contain a link saying "AOL Users Click Here" to bypass the web proxy and to avoid the bugs in the thinning software.