Older blog entries for sness (starting at number 5701)

Stop working (so hard) — I.M.H.O. — Medium

Stop working (so hard) — I.M.H.O. — Medium: "The idea that, without “hustle,” without throwing away nights and weekends, without putting your life on hold for your work, you’ll somehow be more successful, more productive, is ridiculous to me, yet continues to be pushed by participants in our industry left and right. This is, quite simply, insane."

'via Blog this'

Syndicated 2013-07-16 15:28:00 from sness

3.2. Support Vector Machines — scikit-learn 0.13.1 documentation

3.2. Support Vector Machines — scikit-learn 0.13.1 documentation: "In problems where it is desired to give more importance to certain classes or certain individual samples keywords class_weight and sample_weight can be used."
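A quick sketch of what the quoted docs describe, against scikit-learn's `SVC` (the toy data and weights here are made up for illustration):

```python
import numpy as np
from sklearn.svm import SVC

# Toy 1-D data: three samples of each class.
X = np.array([[0.0], [0.2], [0.4], [1.0], [1.2], [1.4]])
y = np.array([0, 0, 0, 1, 1, 1])

# class_weight rescales C per class: here, errors on class 1 cost 5x more.
clf = SVC(kernel="linear", class_weight={1: 5.0}).fit(X, y)

# sample_weight does the same per individual sample, passed at fit time.
clf2 = SVC(kernel="linear").fit(X, y, sample_weight=[1.0, 1.0, 1.0, 5.0, 5.0, 5.0])
```

Both knobs end up scaling the misclassification penalty in the underlying optimization, so they can be combined.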

Syndicated 2013-07-13 04:54:00 from sness

LIBSVM FAQ

LIBSVM FAQ: "Q: Does libsvm have special treatments for linear SVM?
No, libsvm solves linear/nonlinear SVMs by the same way. Some tricks may save training/testing time if the linear kernel is used, so libsvm is NOT particularly efficient for linear SVM, especially when C is large and the number of data is much larger than the number of attributes. You can either

Use small C only. We have shown in the following paper that after C is larger than a certain threshold, the decision function is the same.
S. S. Keerthi and C.-J. Lin. Asymptotic behaviors of support vector machines with Gaussian kernel . Neural Computation, 15(2003), 1667-1689.

Check liblinear, which is designed for large-scale linear classification.
Please also see our SVM guide on the discussion of using RBF and linear kernels."
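The libsvm/liblinear split the FAQ mentions is visible in scikit-learn's wrappers: `SVC` goes through libsvm, while `LinearSVC` goes through liblinear. A minimal sketch (the data is synthetic and linearly separable by construction):

```python
import numpy as np
from sklearn.svm import SVC, LinearSVC

rng = np.random.RandomState(0)
X = rng.randn(300, 20)
y = (X[:, 0] - X[:, 1] > 0).astype(int)  # separable by a hyperplane

# libsvm path: one general solver for linear and nonlinear kernels alike.
libsvm_clf = SVC(kernel="linear", C=1.0).fit(X, y)

# liblinear path: specialized for large-scale linear classification.
liblinear_clf = LinearSVC(C=1.0).fit(X, y)
```

On problems with many samples and comparatively few features, the liblinear path is the one designed to stay fast.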

Syndicated 2013-07-13 04:44:00 from sness

Soft margin classification

Soft margin classification: "The optimization problem is then trading off how fat it can make the margin versus how many points have to be moved around to allow this margin. The margin can be less than 1 for a point by setting ξ_i > 0, but then one pays a penalty of Cξ_i in the minimization for having done that."
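The tradeoff the quote describes is the standard soft-margin objective (this is the textbook formulation, not text from the page itself):

```latex
\min_{\mathbf{w},\, b,\, \boldsymbol{\xi}} \;\; \tfrac{1}{2}\lVert\mathbf{w}\rVert^{2} + C \sum_{i} \xi_i
\quad \text{subject to} \quad
y_i(\mathbf{w}\cdot\mathbf{x}_i + b) \ge 1 - \xi_i, \qquad \xi_i \ge 0 .
```

The slack ξ_i lets point i sit inside (or across) the margin, and C prices each unit of slack against margin width.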

Syndicated 2013-07-13 04:38:00 from sness

Support vector machine - Wikipedia, the free encyclopedia

Support vector machine - Wikipedia, the free encyclopedia: "To keep the computational load reasonable, the mappings used by SVM schemes are designed to ensure that dot products may be computed easily in terms of the variables in the original space, by defining them in terms of a kernel function selected to suit the problem."
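A small self-contained demonstration of that idea: a degree-2 polynomial kernel evaluated in the original 2-D space equals a dot product under an explicit 3-D feature map, without ever constructing that map at prediction time. (The function names here are illustrative, not from any library.)

```python
import math

def poly_kernel(x, z):
    # K(x, z) = (x . z)^2, computed directly in the 2-D input space.
    return (x[0] * z[0] + x[1] * z[1]) ** 2

def phi(x):
    # Explicit feature map into 3-D: phi(x) = (x1^2, x2^2, sqrt(2)*x1*x2).
    return (x[0] ** 2, x[1] ** 2, math.sqrt(2) * x[0] * x[1])

x, z = (1.0, 2.0), (3.0, 0.5)
lhs = poly_kernel(x, z)                             # kernel in input space
rhs = sum(a * b for a, b in zip(phi(x), phi(z)))    # dot product in feature space
# lhs and rhs agree up to floating point
```

The same trick is what lets the RBF kernel work with an infinite-dimensional implicit feature space.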

Syndicated 2013-07-13 04:37:00 from sness

opencv - How to speed up svm.predict? - Stack Overflow

opencv - How to speed up svm.predict? - Stack Overflow: "The prediction algorithm for an SVM takes O(nSV * f) time, where nSV is the number of support vectors and f is the number of features. The number of support vectors can be reduced by training with stronger regularization, i.e. by increasing the hyperparameter C (possibly at a cost in predictive accuracy)."
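The O(nSV * f) cost is easy to see in the shape of the kernelized decision function; a pure-Python sketch (names like `decision` and the tiny inputs are illustrative, not an actual libsvm interface):

```python
import math

def rbf(u, v, gamma=0.5):
    # One RBF kernel evaluation touches every feature: O(f).
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(u, v)))

def decision(x, support_vectors, dual_coefs, bias):
    # f(x) = sum_i alpha_i * K(sv_i, x) + b:
    # one O(f) kernel call per support vector, hence O(nSV * f) per prediction.
    return sum(a * rbf(sv, x) for a, sv in zip(dual_coefs, support_vectors)) + bias
```

Halving the number of support vectors halves prediction time, which is why the quoted advice targets nSV.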

Syndicated 2013-07-12 17:44:00 from sness

QuerySet API reference | Django documentation | Django

QuerySet API reference | Django documentation | Django: "In some complex data-modeling situations, your models might contain a lot of fields, some of which could contain a lot of data (for example, text fields), or require expensive processing to convert them to Python objects. If you are using the results of a queryset in some situation where you know you don’t need those particular fields, you can tell Django not to retrieve them from the database."
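The QuerySet methods the docs are describing are `defer()` and `only()`. A sketch of their use (not runnable outside a configured Django project; `Entry` and its fields are hypothetical model names):

```python
from myapp.models import Entry  # hypothetical app and model

# Load every field except the large text column; "body" is only fetched
# from the database if you actually access it on an instance.
light = Entry.objects.defer("body")

# The inverse: load only "headline" (plus the primary key); all other
# fields are deferred.
headlines = Entry.objects.only("headline")
```

Accessing a deferred field later triggers an extra query per object, so this is a win only when the skipped fields usually go unread.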

Syndicated 2013-07-12 04:47:00 from sness

Support Vector Machines: Parameters

Support Vector Machines: Parameters: "However, it is critical here, as in any regularization scheme, that a proper value is chosen for C, the penalty factor. If it is too large, we have a high penalty for nonseparable points and we may store many support vectors and overfit. If it is too small, we may have underfitting."
Alpaydin (2004), page 224
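The over/underfitting behavior Alpaydin describes can be seen by sweeping C on noisy data; a sketch with scikit-learn (the dataset, noise level, and C grid are arbitrary choices for illustration):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.RandomState(0)
X = rng.randn(400, 2)
labels = (X[:, 0] ** 2 + X[:, 1] ** 2 < 1.0).astype(int)  # circle boundary
y = labels ^ (rng.rand(400) < 0.1).astype(int)            # flip ~10% of labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

scores = {}
for C in (0.001, 1.0, 10000.0):
    clf = SVC(kernel="rbf", C=C).fit(X_tr, y_tr)
    scores[C] = (clf.score(X_tr, y_tr), clf.score(X_te, y_te))
```

A very small C collapses toward the majority class (underfitting), while a very large C chases the label noise; a moderate C usually gives the best held-out score, which is why C is normally chosen by cross-validation.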

Syndicated 2013-07-11 21:17:00 from sness
