Tuesday, November 29, 2011

An idea: an anti-framework for science

So here's my idea: an anti-framework. I'm interested in building a framework, because it would be nice to provide a standard set of libraries and modules to work with (much like the standard library, but extended to cover the scientific domain). However, the problem with building frameworks is basically the competing-standards problem.

So here's my new idea: an anti-framework, built out of a relatively small number of de facto standard scientific components for Python (numpy, scipy, matplotlib, and I imagine some others), with some zingy name. Any framework or environment which passes the anti-framework's unit tests is then "zingy-name-compatible". You don't run with the zingy-name framework alone, because lots of people need more than what's in the standard framework for all sorts of reasons. It's perfectly okay for some-other-framework to extend zingy-name-framework, so long as it remains backwards-compatible with it.

Then, anyone building a relevant application, framework, or whatever can be zingy-compatible if it supports any tool written using only the zingy core libraries.
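To make the compatibility idea concrete, here's a minimal sketch of what a "zingy-compatible" check might look like. All names here are hypothetical, and I've used stdlib modules as stand-ins so the sketch runs anywhere; the real core set would pin numpy, scipy, matplotlib and their versions:

```python
import importlib

# Hypothetical core set; stdlib stand-ins keep this sketch runnable anywhere.
# A real anti-framework would list numpy, scipy, matplotlib, etc.
CORE_MODULES = ["math", "json", "itertools"]

def is_compatible(modules=CORE_MODULES):
    """An environment is 'zingy-compatible' if every core module imports cleanly."""
    for name in modules:
        try:
            importlib.import_module(name)
        except ImportError:
            return False
    return True

print(is_compatible())                    # True in any standard environment
print(is_compatible(["no_such_module"]))  # False
```

A real check would also assert versions and run each core library's own test suite, but the shape is the same: the anti-framework is just a pinned list plus the tests that define "compatible".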

Wednesday, September 7, 2011

"Inspiration is for amateurs; the rest of us just show up and get to work."

I came across this fantastic quote today, capturing something I've always found to be true. I can now add it as a second item to my list of workplace truths. The first is that the relentless pursuit of simplicity is the only weapon we have against the complexity of what we are trying to achieve.

“The advice I like to give young artists, or really anybody who’ll listen to me, is not to wait around for inspiration. Inspiration is for amateurs; the rest of us just show up and get to work. If you wait around for the clouds to part and a bolt of lightning to strike you in the brain, you are not going to make an awful lot of work. All the best ideas come out of the process; they come out of the work itself. Things occur to you. If you’re sitting around trying to dream up a great art idea, you can sit there a long time before anything happens. But if you just get to work, something will occur to you and something else will occur to you and something else that you reject will push you in another direction. Inspiration is absolutely unnecessary and somehow deceptive. You feel like you need this great idea before you can get down to work, and I find that’s almost never the case.” ~ Chuck Close

Tuesday, July 26, 2011

Debug my tutorials (and try something new)


As I've mentioned before, I'm working on a new project to help collect benchmarks for Python programs. To start with, only CPU performance is measured, since that's the easiest data to collect, but I'd like to collect other metrics eventually. The wiki has three fairly thorough tutorials, with more under construction. Installation of the package with easy_install seems to be working okay, and I believe all the tutorials have correctly functioning code snippets.
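For a flavour of the kind of CPU measurement involved, here is a rough sketch of a profiling decorator built on the standard library's cProfile. This is illustrative only and is not benchmarker.py's actual API; the names `benchmark` and `print_stats` are just borrowed for the sketch:

```python
import cProfile
import functools
import io
import pstats

# One shared profiler, mirroring a module-level stats collector.
_profiler = cProfile.Profile()

def benchmark(func):
    """Profile every call to func, accumulating stats in the shared profiler."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        _profiler.enable()
        try:
            return func(*args, **kwargs)
        finally:
            _profiler.disable()
    return wrapper

def print_stats():
    """Return the accumulated timing stats as a string."""
    stream = io.StringIO()
    pstats.Stats(_profiler, stream=stream).print_stats()
    return stream.getvalue()

@benchmark
def time_me():
    for _ in range(100):
        pass

time_me()
print(print_stats())
```

The real package layers persistence and reporting on top of something like this, but the core idea is the same: wrap the function, profile each call, and aggregate.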

If you have an interest in collecting performance data on your Python program, perhaps you'd be willing to work through the tutorials and let me know if you come across any bugs in benchmarker.py or in the tutorials themselves. If you have any questions you'd like to see addressed in the tutorials, please also let me know!

If you want to cut straight to the action, you can install benchmarker.py with

easy_install benchmarker.py

(although I'd suggest doing this in a virtualenv for the time being). Please let me know if that doesn't work -- this whole exercise will be somewhat undercut if others have trouble installing the package!

Thanks very much to those who have already helped identify bugs and issues, and contributed to lifting the standard of this package.


Sunday, May 8, 2011

Help testing my new package-in-progress

Could some helpful Pythonista please try:

easy_install benchmarker.py

and let me know if it installs for you? If it does, here's a simple script that exercises it; it's a pretty thin usage of the package, but proves it works in a basic sense:

>>> import bench
>>> from bench.benchmarker import benchmark
>>> @benchmark()
... def time_me():
...    for _ in range(100):
...       pass
>>> time_me()
>>> bench.benchmarker.print_stats()
Sat May  7 22:06:06 2011    /tmp/tmp.pstats

         300 function calls in 0.002 CPU seconds

   Random listing order was used

   ncalls  tottime  percall  cumtime  percall filename:lineno(function)
      100    0.000    0.000    0.000    0.000 {method 'disable' of '_lsprof.Profiler' objects}
      100    0.001    0.000    0.002    0.000 :1(time_me)
      100    0.001    0.000    0.001    0.000 {range}


Sunday, January 23, 2011

python-twitter-tools wins!

(1) Install python-twitter-tools
(2) Register an app with Twitter to get OAuth credentials
(3) Set the four credential variables below to the values you got in step 2
(4) Run this code ftw!

python-twitter-tools has the most natural Twitter API I've seen. It's really beautiful. The namespace follows the RESTful API URLs, so you don't need any documentation other than what's on the Twitter dev website. It really does Just Work.
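The "namespace follows the REST URLs" trick can be sketched in a few lines. This is not the library's actual implementation, just the general pattern, with an illustrative base URL and endpoint:

```python
# Sketch of attribute-chain-to-URL dispatch (illustrative, not the real library).
class APIProxy(object):
    def __init__(self, base, parts=()):
        self._base = base
        self._parts = parts

    def __getattr__(self, name):
        # Each attribute access extends the URL path by one segment.
        return APIProxy(self._base, self._parts + (name,))

    def __call__(self, **params):
        # Calling the chain builds the final URL; a real client would
        # issue the HTTP request here instead of returning the pieces.
        url = self._base + "/" + "/".join(self._parts) + ".json"
        return url, params

api = APIProxy("https://api.twitter.com/1.1")
url, params = api.statuses.user_timeline(screen_name="example")
print(url)  # https://api.twitter.com/1.1/statuses/user_timeline.json
```

Because `__getattr__` just accumulates path segments, every endpoint on the Twitter dev site is reachable without the client library having to enumerate them.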

import twitter

# Fill in these four variables with the credentials from your registered app:
# access_key, access_secret, consumer_key, consumer_secret

if __name__ == "__main__":

    oa = twitter.oauth.OAuth(access_key, access_secret, consumer_key, consumer_secret)
    tw = twitter.Twitter(auth=oa)
    followers = tw.statuses.followers()
    print "You have", len(followers), "followers"
    for f in followers:
        print f['name']

Saturday, January 22, 2011

Three concentric spherical surfaces, with a randomly-connected graph overlaid on each one, spinning.

Three concentric spherical surfaces, each with a randomly-connected graph overlaid on it, spinning: http://www.youtube.com/watch?v=bXOTCZXkUjk ... born of an idea I wanted to explore. Python + Mayavi = rapid gratification! I thought I'd post the code for anyone who's interested. I'm trialling the Enthought Python Distribution; it certainly provides a nice technology stack to work with. All the components could be installed separately, but I'm very happy to have something where things "just work" (at least, so far).

from enthought.mayavi import mlab
from random import random
from math import sin
from math import cos
import math

radius = 1
azimuth = 0
inclination = 0

x, y = 0, 0

RAD1 = 1
RAD2 = 2
RAD3 = 3

def coord_from_rai((r, a, i)):
    # Spherical (radius, azimuth, inclination) -> Cartesian (x, y, z).
    x = round(r * sin(i) * cos(a), 4)
    y = round(r * sin(i) * sin(a), 4)
    z = round(r * cos(i), 4)
    return (x, y, z)

def rai_from_coord((x, y, z)):
    # Cartesian (x, y, z) -> spherical (radius, azimuth, inclination).
    rad = math.sqrt(x**2 + y**2 + z**2)
    azimuth = math.atan2(y, x)  # atan2 handles all quadrants and x == 0
    inclination = math.acos(z / rad)
    return (rad, azimuth, inclination)

def test_point_spherical_conversions():
    points = [(1., 0., 0.),
              (0., 1., 0.),
              (0., 0., 1.),
              (1., 1., 1.)]
    for point in points:
        assert coord_from_rai(rai_from_coord(point)) == point
        print "Original Point: ", point
        print "Rad, Az, Inc: ", rai_from_coord(point)
        print "Converted Coord: ", coord_from_rai(rai_from_coord(point))

def frange(start, stop, inc):
    # Like range(), but yields floats in steps of inc (to 0.01 resolution).
    for i in range(int(start * 100), int(stop * 100), int(inc * 100)):
        yield float(i) / 100.

def points_for_line_arc(point1, point2, coords):
    # Interpolate 15 subpoints along the arc from point1 to point2 in
    # spherical coordinates, so the connecting line hugs the sphere.
    rad1, azimuth1, inclination1 = point1
    rad2, azimuth2, inclination2 = point2
    delta_rad = (rad2 - rad1) / 15.
    delta_az = (azimuth2 - azimuth1) / 15.
    delta_inc = (inclination2 - inclination1) / 15.
    line = [point1]
    for i in range(1, 15):
        (rad, az, inc) = line[-1]
        line.append((rad + delta_rad, az + delta_az, inc + delta_inc))
    if line[-1] != point2:
        line.append(point2)
    return line

def get_unique_random_point(radius, coords):
    # Note: "unique" is aspirational -- collisions with coords aren't checked.
    azimuth = 2 * math.pi * random()
    inclination = math.pi * random()  # inclination spans [0, pi]
    return (radius, azimuth, inclination)

def plot_randomly_connected(radius, count, color=(1., 1., 1.)):
    """Plot count random points on a sphere of the given radius about the origin.

    Each new point is connected to the previous one by an arc line of
    interpolated subpoints with high probability.
    """
    coords = []
    for _ in range(0, count):
        coord = get_unique_random_point(radius, coords)
        coords.append(coord)
        if len(coords) > 1 and random() > .001:
            line = coords[-2:]
            line = points_for_line_arc(line[0], line[1], coords)
            cartesianLine = [coord_from_rai(point) for point in line]
            (x, y, z) = zip(*cartesianLine)
            mlab.plot3d(x, y, z, opacity=0.05, tube_radius=0.025, color=color)
    cartesianPoints = [coord_from_rai(point) for point in coords]
    (x, y, z) = zip(*cartesianPoints)
    mlab.points3d(x, y, z, color=color, opacity=0.8/radius, scale_factor=1./(2*radius))

plot_randomly_connected(1, 4, color=(.7,.2,.2))
plot_randomly_connected(2, 16, color=(.2,.7,.2))
plot_randomly_connected(3, 32, color=(.2,.2,.7))


@mlab.animate(delay=50)
def anim():
    f = mlab.gcf()
    while 1:
        # Rotate the camera a little each frame, then yield control to the GUI.
        f.scene.camera.azimuth(1)
        f.scene.render()
        yield

a = anim()  # Starts the animation.
mlab.show()