Databases

Calling Mysql Iteratively and Passing Parameters from Bash Loop

This specific content was written 10 years ago. Please keep this in mind, as it may be outdated and may not adhere to best practices.

Below is an example of calling a MySQL script from a file and passing it a parameter.

sql script

So make a file with the following (or whatever else you wish):

SET @i = 1;
USE databaseName;
SET @pID = @i + 100;
SET @partnerID = CONCAT('partner', @pID);
SELECT @i, @pID, @partnerID;

Now, this will print out the values of @i, @pID, and @partnerID.
Below that you could include INSERT statements or whatever else you wish.

Now, how do we call this from bash and set the @i parameter?

bash script

#!/bin/bash
user=root
password=mysqlPass
database=db   # note: the USE statement inside sample.sql still takes precedence

for i in {300..400..5}
do
    # Rewrite the first line of sample.sql with the new SET @i value
    sed -i "1 s/.*/SET @i=$i;/" sample.sql
    mysql --user="$user" --password="$password" "$database" < sample.sql
done

So how does this work?
We use sed to completely replace the first line of sample.sql with a new “SET @i=” statement containing the value of i we want.
We then call mysql and feed it the rewritten file.

Another way to do this would be to prepend the SET statement to the contents of sample.sql and pipe the result to mysql entirely in memory, leaving the file untouched.
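A minimal sketch of that alternative (hedged: `build_sql` is a made-up helper name, and it skips sample.sql’s first line so the prepended SET statement is the one that takes effect):

```shell
#!/bin/bash
# Build the SQL stream in memory instead of rewriting the file with sed.
# build_sql prepends the SET statement and skips sample.sql's first line
# (the SET @i placeholder), so the prepended value wins.
build_sql() {
    echo "SET @i=$1;"
    tail -n +2 sample.sql
}

# Stand-in sample.sql for the demo; in real use the file already exists.
printf 'SET @i = 1;\nSELECT @i;\n' > sample.sql

for i in {300..400..5}
do
    # In real use, pipe straight to mysql:
    #   build_sql "$i" | mysql --user="$user" --password="$password"
    build_sql "$i"
done
```

Nothing is written back to sample.sql, so the file stays unchanged between runs.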

SO: Removing duplicate rows from data pulled from a DB in Java

This specific content was written 12 years ago. Please keep this in mind, as it may be outdated and may not adhere to best practices.

I’ve become an active member of Stack Overflow, both asking and answering questions.
It’s amazing: there’s a whole community there that I hadn’t noticed before.
As I haven’t had time to post new items, I’ll be borrowing my answers to Stack Overflow questions and re-posting them here.


User Question:

So here’s my question. I have a giant text file of data and I need to input all of it into a MySQL database quickly, obviously using a Java program. My only problem is that the data is identified by a certain ID. Some of these IDs have duplicates that contain exactly the same info as each other. I would like to remove all of these for sorting purposes and clarity’s sake.

What would be the best way to go about this? If anyone could help, I’d appreciate it.

Answer:

clones/duplicates

While reading the data, keep a HashMap or HashSet. Check whether the ID already exists in the map/set; if so, skip the row. Otherwise, add the ID to the set/map and insert the row.

An aside: the difference between HashMap and HashSet is that a HashSet only stores values while a HashMap stores key-value pairs. However, HashSet is itself backed by a HashMap internally and just inserts a dummy object as each value. See: Differences between HashMap and Hashtable?
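In Java, the answer above might look like this (a hedged sketch: `DedupDemo`, `dedup`, and the pipe-delimited row format are illustrative assumptions, not from the original question):

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class DedupDemo {
    // Keep only the first row seen for each ID (the text before the '|').
    static List<String> dedup(List<String> rows) {
        Set<String> seenIds = new HashSet<>();
        List<String> unique = new ArrayList<>();
        for (String row : rows) {
            String id = row.split("\\|")[0];
            // add() returns false when the ID is already in the set
            if (seenIds.add(id)) {
                unique.add(row); // first occurrence: keep it / insert into the DB here
            }
        }
        return unique;
    }

    public static void main(String[] args) {
        List<String> rows = List.of("101|alice", "102|bob", "101|alice", "103|carol");
        System.out.println(dedup(rows)); // the repeated ID 101 is kept only once
    }
}
```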

More

Remove duplicate entries from mysql table

This specific content was written 12 years ago. Please keep this in mind, as it may be outdated and may not adhere to best practices.

Was writing some code and ran into this problem, so I’m posting about solving it with one command (or more, depending on your decision). This applies when you want to remove all duplicate rows that share the same combination of N columns.

I’ll talk about two options in this post:

Add a unique key constraint that drops all duplicates

This one removes all duplicates (based on the combination of columns you give) in one line.
The approach works by altering the table and adding a unique key. Applying this unique key causes all duplicate rows to be dropped, leaving only one.
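A hedged sketch of what that looks like, using the historical MySQL syntax (table and column names are placeholders; note that ALTER IGNORE was removed in MySQL 5.7, so this one-liner only works on older servers):

```sql
-- IGNORE silently drops rows that would violate the new unique key,
-- keeping one row per (col_a, col_b) combination.
ALTER IGNORE TABLE my_table
  ADD UNIQUE INDEX dedup_idx (col_a, col_b);
```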

More

Add new JSON document using plain text string with LightCouch

This specific content was written 12 years ago. Please keep this in mind, as it may be outdated and may not adhere to best practices.

Hey,

A small post outlining a small modification to the LightCouch 0.4 source to allow direct plain-text JSON insertion. A colleague and I had a JSON string that we wanted to insert through LightCouch without having to build a JSON object first, only to have .toString() called on it during the final POST request.

More

Killing CouchDB on Ubuntu 10.04

This specific content was written 12 years ago. Please keep this in mind, as it may be outdated and may not adhere to best practices.

I wanted to do some maintenance and move my CouchDB data files to another partition, but CouchDB would not stop when I ran the usual stop commands:

1) sudo /etc/init.d/couchdb stop
2) sudo service couchdb stop

I found a lot of posts, and out of all of them this nice tidbit below did the trick:

ps -U couchdb -o pid= | xargs kill -9

Note: it’s not ideal to kill a process this viciously. Databases usually have shutdown tasks to perform, need to finish in-flight insertions, and so on. I’ll post if I find a cleaner fix.
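In that spirit, a gentler variant might ask politely first (a hedged sketch: `graceful_kill` is a made-up helper that sends SIGTERM, waits, then SIGKILLs only the survivors):

```shell
#!/bin/bash
# Send SIGTERM to every process owned by the given user, give them a few
# seconds to shut down cleanly, then SIGKILL whatever is still running.
graceful_kill() {
    local pids
    pids=$(ps -U "$1" -o pid= 2>/dev/null || true)
    [ -z "$pids" ] && return 0      # no processes for that user
    kill $pids 2>/dev/null          # polite SIGTERM first
    sleep 5
    kill -9 $pids 2>/dev/null       # force-kill any survivors
    return 0
}

# Usage, matching the one-liner above:
#   graceful_kill couchdb
```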


Data insertion to CouchDB using Lightcouch lib in Java (Sample)

This specific content was written 12 years ago. Please keep this in mind, as it may be outdated and may not adhere to best practices.

Introduction

Hello, here’s a small sample of how to insert JSON data into CouchDB.

First off, there are MANY Java libraries for accessing CouchDB. The two links below may help you decide which one to choose:
1) http://stackoverflow.com/questions/1105807/which-java-library-libraries-for-couchdb
2) http://wiki.apache.org/couchdb/Getting_started_with_Java

In this post we use LightCouch, as we wanted something relatively simple without object mappings and other such features.

Aside: another interesting tutorial, using couchdb4j and describing sessions, is: http://www.drdobbs.com/jvm/223100116

More

Reverse Proxy in Apache2 (Ubuntu 10.04) for CouchDB

This specific content was written 12 years ago. Please keep this in mind, as it may be outdated and may not adhere to best practices.

Installing Modules

There are various sites with instructions that did not work for me.
However, following the instructions here did the trick: http://abhirama.wordpress.com/2008/11/03/apache-mod_proxy-in-ubuntu/

The reverse proxy module was installed:

sudo apt-get install libapache2-mod-proxy-html

Libxml was installed:

sudo apt-get install libxml2-dev
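With the modules in place, the proxying itself is only a couple of directives (a hedged sketch: the `/couchdb` path is an assumption, and 5984 is CouchDB’s default port):

```apache
# In the relevant VirtualHost / site config, after enabling mod_proxy:
<Location /couchdb>
    ProxyPass http://127.0.0.1:5984/
    ProxyPassReverse http://127.0.0.1:5984/
</Location>
```

Remember to enable the proxy modules (sudo a2enmod proxy proxy_http) and reload Apache for the directives to take effect.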

More
