cairo-dock-team team mailing list archive

[Merge] lp:~eduardo-mucelli/cairo-dock-plug-ins-extras/WebSearch into lp:cairo-dock-plug-ins-extras

 

Eduardo Mucelli Rezende Oliveira has proposed merging lp:~eduardo-mucelli/cairo-dock-plug-ins-extras/WebSearch into lp:cairo-dock-plug-ins-extras.

Requested reviews:
  Cairo-Dock Team (cairo-dock-team)

For more details, see:
https://code.launchpad.net/~eduardo-mucelli/cairo-dock-plug-ins-extras/WebSearch/+merge/82587

Simplifying the dependencies. Fixing Wikipedia search.
-- 
https://code.launchpad.net/~eduardo-mucelli/cairo-dock-plug-ins-extras/WebSearch/+merge/82587
Your team Cairo-Dock Team is requested to review the proposed merge of lp:~eduardo-mucelli/cairo-dock-plug-ins-extras/WebSearch into lp:cairo-dock-plug-ins-extras.
=== modified file 'WebSearch/Changelog.txt'
--- WebSearch/Changelog.txt	2010-07-13 16:02:17 +0000
+++ WebSearch/Changelog.txt	2011-11-17 18:56:26 +0000
@@ -1,3 +1,4 @@
+1.4.2: (November/17/2011): Simplifying the dependencies. Fixing Wikipedia search.
 1.4.0: (July/13/2010: WebSearch now keeps a history of recently searched terms.
 1.3.1: (June/29/2010: URL encoding was not being done. Fixed it.
 1.3.0: (May/28/2010): WebSearch now fetch results from Digg. Changing the thumb download directory handling.

=== modified file 'WebSearch/README'
--- WebSearch/README	2010-04-20 03:16:10 +0000
+++ WebSearch/README	2011-11-17 18:56:26 +0000
@@ -1,21 +1,9 @@
 In order to use WebSearch applet it is necessary to install Ruby, Rubygems and some gems.
 
-# Installing requirements
-
-[+] Ruby 1.8: sudo apt-get install ruby1.8-dev ruby1.8 ri1.8 rdoc1.8 irb1.8 libreadline-ruby1.8 libruby1.8
-[+] Rubygems: Instructions are neatly described in http://rubygems.org/pages/download
-[+] Gem parseconfig: (sudo) gem install parseconfig
-[+] Lib XSLT e XML required by Nokogiri Gem: (sudo) apt-get install libxslt-dev libxml2-dev
-[+] Gem Nokogiri: (sudo) gem install nokogiri
-[+] Gem Launchy: (sudo) gem install launchy
-
-[+] Ruby-Dbus
-  [+] Download [http://github.com/downloads/mvidner/ruby-dbus/ruby-dbus-0.3.0.tgz] and decompress the tarball
-  [+] Enter in the generated directory use the following commands:
-      $ ruby setup.rb config
-      $ ruby setup.rb setup
-     ($ sudo)
-      # ruby setup.rb install
+# Installing the dependencies
+
+[1st] sudo apt-get install ruby rubygems libxslt-dev libxml2-dev
+[2nd] sudo gem install parseconfig addressable launchy nokogiri ruby-dbus
 
 # WebSearch installation process
 

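A quick way for reviewers to confirm that the simplified dependency list resolves is to require each library from Ruby. This is only a sketch; the require names below are the conventional ones for these gems (the ruby-dbus gem loads as 'dbus') and the check_deps.rb file name is hypothetical, not part of the branch:

    # check_deps.rb -- hypothetical helper, not shipped with the applet.
    # Requires each gem from the simplified dependency list; a missing gem
    # raises LoadError immediately.
    require 'rubygems'
    %w[parseconfig addressable/uri launchy nokogiri dbus].each do |lib|
      require lib
      puts "ok: #{lib}"
    end

Running it with "ruby check_deps.rb" on a machine where the two commands above succeeded should print five ok lines.
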
=== modified file 'WebSearch/WebSearch.conf'
--- WebSearch/WebSearch.conf	2011-11-06 01:58:42 +0000
+++ WebSearch/WebSearch.conf	2011-11-17 18:56:26 +0000
@@ -1,4 +1,4 @@
-#1.4.1
+#1.4.2
 
 #[gtk-about]
 [Icon]

=== modified file 'WebSearch/auto-load.conf'
--- WebSearch/auto-load.conf	2011-05-19 23:57:32 +0000
+++ WebSearch/auto-load.conf	2011-11-17 18:56:26 +0000
@@ -10,4 +10,4 @@
 category = 3
 
 # Version of the applet; change it everytime you change something in the config file. Don't forget to update the version both in this file and in the config file.
-version = 1.4.1
+version = 1.4.2

=== modified file 'WebSearch/lib/Wikipedia.rb'
--- WebSearch/lib/Wikipedia.rb	2010-07-13 16:02:17 +0000
+++ WebSearch/lib/Wikipedia.rb	2011-11-17 18:56:26 +0000
@@ -18,12 +18,13 @@
 	
 	# Fetch links from english Wikipedia. It is necessary to set user agent, or the connection is Forbidden (403)
 	def retrieve_links(query, offset = 0)
-		wikipedia = Nokogiri::HTML(open(URI.encode("#{self.query_url}#{query}&offset=#{offset}&limit=#{self.number_of_fetched_links}", 'User-Agent' => 'ruby')))
-		self.stats = retrieve_webshots_result_wikipedia(wikipedia, query)
-		(wikipedia/"ul[@class='mw-search-results']/li/a").each do |res|
-			url = res['href']
+		puts "Query #{self.query_url}#{query}&offset=#{offset}&limit=#{self.number_of_fetched_links}"
+		wikipedia = Nokogiri::HTML(open("#{self.query_url}#{query}&offset=#{offset}&limit=#{self.number_of_fetched_links}", 'User-Agent' => 'ruby'))
+		#self.stats = retrieve_webshots_result_wikipedia(wikipedia, query)
+		(wikipedia/"ul[@class='mw-search-results']/li/div/a").each do |res|
+			url = "#{self.base_url}#{res['href']}"
 			description = res['title']
-			self.links << Link.new("#{self.query_url}#{url}", description)
+			self.links << Link.new(url, description)
 		end
 		self.links
 	end

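For context on the Wikipedia fix above: the result anchors now sit one level deeper in the search page markup (li/div/a instead of li/a) and carry relative hrefs, so the applet prepends base_url rather than query_url when building each Link. Below is a minimal standalone sketch of the same lookup, mirroring the open-uri/Nokogiri calls and selector from the patched retrieve_links; the base_url and query_url values are assumptions, since neither appears in this diff:

    # wikipedia_links.rb -- illustrative sketch only, not the applet's code.
    require 'rubygems'
    require 'open-uri'
    require 'nokogiri'

    base_url  = "http://en.wikipedia.org"                               # assumed value
    query_url = "#{base_url}/w/index.php?title=Special:Search&search="  # assumed value
    query, offset, limit = "cairo", 0, 10

    # The User-Agent header is required, otherwise Wikipedia answers 403.
    page = Nokogiri::HTML(open("#{query_url}#{query}&offset=#{offset}&limit=#{limit}",
                               'User-Agent' => 'ruby'))

    # Result anchors moved under an extra div and their hrefs are relative,
    # hence the base_url prefix.
    (page/"ul[@class='mw-search-results']/li/div/a").each do |res|
      puts "#{res['title']} -> #{base_url}#{res['href']}"
    end
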
