To dump tables t1, t2, ... from MySQL database db:
mysqldump -u USERNAME -p db t1 t2 ... > tables.sql
To restore the tables:
mysql -u USERNAME -p db < tables.sql
Friday, September 23, 2016
Friday, September 2, 2016
Python loop dictionary
d = {}
Python 2: for key, val in d.iteritems():
Python 3: for key, val in d.items():
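A runnable example (d.items() also works on Python 2, where it builds a full list instead of an iterator):
d = {'a': 1, 'b': 2}
for key, val in d.items():  # on Python 2, d.iteritems() avoids building the list
    print(key, val)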
python - No module named MySQLdb?
Simply run the command: pip install MySQL-python
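Note that MySQL-python only supports Python 2; on Python 3 the drop-in replacement is mysqlclient (pip install mysqlclient), which provides the same MySQLdb module. Once installed, a minimal connection sketch (the host, user, password, and database name below are placeholders, not real credentials):
import MySQLdb

# placeholder connection details -- replace with your own
conn = MySQLdb.connect(host='localhost', user='USERNAME',
                       passwd='PASSWORD', db='db')
cur = conn.cursor()
cur.execute('SELECT VERSION()')  # sanity check: print the server version
print(cur.fetchone())
conn.close()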
Tuesday, March 25, 2014
Something to note about "foreach $element (@array)"
In each foreach iteration, the loop variable $element is an alias for the current element, so modifying it modifies @array itself!
Wednesday, October 30, 2013
Perl sort function
To sort in alphabetical (string) order, use 'cmp'; to sort in numerical order, use '<=>'.
Sort an array:
my @new = sort {$a cmp $b} @old;
Sort an array numerically:
my @new = sort {$a <=> $b} @old;
Sort a hash (sort by key, return sorted keys):
my @new = sort {$a cmp $b} keys %old;
Sort a hash (sort by value, return sorted keys):
my @new = sort {$old{$a} cmp $old{$b}} keys %old;
Wednesday, April 24, 2013
Reasons for over-fitting and how to fix it
Usually, over-fitting is caused by the capacity to learn overly complex hypotheses, and aggravated by noise in the training data.
To fix over-fitting, we can: (1) add more training data, (2) simplify the model, (3) prune the data, (4) hold out a gold-annotated validation set and stop training when the validation error starts to rise (early stopping; see the sketch below).
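A toy sketch of early stopping (point 4): we fit a single parameter by gradient descent on noisy training data and stop once error on a held-out validation set stops improving. All names and data here are made up for illustration.
import random

random.seed(0)
# toy data: y = 2x + noise, split into training and held-out validation sets
train = [(x, 2 * x + random.gauss(0, 0.5)) for x in range(20)]
valid = [(x, 2 * x + random.gauss(0, 0.5)) for x in range(20, 30)]

def mse(w, data):
    return sum((w * x - y) ** 2 for x, y in data) / len(data)

w, lr = 0.0, 0.001
best_w, best_err = w, float('inf')
patience, bad_epochs = 3, 0
for epoch in range(200):
    # one gradient-descent step on the training set
    grad = sum(2 * x * (w * x - y) for x, y in train) / len(train)
    w -= lr * grad
    err = mse(w, valid)  # evaluate on the gold (validation) set
    if err < best_err:
        best_w, best_err, bad_epochs = w, err, 0
    else:
        bad_epochs += 1
        if bad_epochs >= patience:  # validation error keeps rising: stop
            break
print('best w: %.3f, validation error: %.3f' % (best_w, best_err))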
Some terminology in NLP
Induction: from examples, generate rules.
Deduction: from rules, derive instances.
Bias: the inability of the model to approximate the data; it usually arises from an insufficiently complex hypothesis space.
Variance: sensitivity to changes in the training data; usually due to the capacity to learn complex hypotheses.
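These two terms fit together in the standard bias-variance decomposition of expected squared error, for a model $\hat{f}$ trained on random draws of the training data, with irreducible noise variance $\sigma^2$:
$$\mathbb{E}\big[(y - \hat{f}(x))^2\big] = \mathrm{Bias}\big[\hat{f}(x)\big]^2 + \mathrm{Var}\big[\hat{f}(x)\big] + \sigma^2$$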