Book Improvement

You can discuss all aspects of programming and technical matters here.

Moderators: Harvey Williamson, Watchman

chrislipa
Posts: 1
Joined: Wed Oct 01, 2008 10:58 pm

Book Improvement

Post by chrislipa »

Maybe this is a stupid question, but is there some great impediment to a computing project that tries to develop the opening book? The fact that the problem is so embarrassingly parallelizable makes me think that it would be ideal for distributed computing. I'm sure that someone else must have thought of this idea. Is this being done currently, or is anyone trying to?
Tony
Member
Posts: 11
Joined: Fri Oct 12, 2007 9:48 am

Re: Book Improvement

Post by Tony »

chrislipa wrote: Maybe this is a stupid question, but is there some great impediment to a computing project that tries to develop the opening book? The fact that the problem is so embarrassingly parallelizable makes me think that it would be ideal for distributed computing. I'm sure that someone else must have thought of this idea. Is this being done currently, or is anyone trying to?
I've been working on this for a couple of years, but the same question comes back every time: how should one do that? (Without it taking a couple of decades.)
Analyzing 1M positions (not very much) for 10 seconds each (not much either) takes over 100 CPU-days.

Tony
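
For reference, the figure is straightforward to verify; a minimal arithmetic check in Python, with no engine-specific assumptions:

# Back-of-the-envelope check of the "over 100 CPU-days" figure:
# 1,000,000 positions at 10 seconds each, converted to days.
positions = 1_000_000
seconds_per_position = 10
cpu_days = positions * seconds_per_position / 86_400  # 86,400 seconds in a day
print(round(cpu_days, 1))  # ~115.7 CPU-days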
Dylan Sharp
Senior Member
Posts: 2431
Joined: Fri Aug 10, 2007 12:07 am

Re: Book Improvement

Post by Dylan Sharp »

Tony wrote: Analyzing 1M positions (not very much) for 10 seconds each (not much either) takes over 100 CPU-days.

Tony
Huh, you put 100 CPUs to do the job and do it in one day? If a very popular website asked its users to help with the task, they would, and the job would get done even faster.

The CAP project tried to give reliable computer evaluations of the most important openings, and it succeeded to some extent; the problem is that once a new, stronger program comes out, the old evaluations tend to become obsolete. For instance, Rybka 3 showed that a position whose CAP evaluation favored Black was in reality a forced mate against Black, but the engine CAP used couldn't see it.
Tony
Member
Posts: 11
Joined: Fri Oct 12, 2007 9:48 am

Re: Book Improvement

Post by Tony »

Dylan Sharp wrote:
Tony wrote: Analyzing 1M positions (not very much) for 10 seconds each (not much either) takes over 100 CPU-days.

Tony
Huh, you put 100 CPUs to do the job and do it in one day? If a very popular website asked its users to help with the task, they would, and the job would get done even faster.
I don't think so. The whole point is that during the analysis you have access to the positions you have already done; otherwise your 10 seconds will never be better than a plain 10-second search of each position in isolation.
Done that way, you might need 10 times as many positions and 100 times the analysis time, and IMO it's still worthless.

The analysis has to be saved as a tree, so that if a program discovers a checkmate at some point, that result is minimaxed back and the other scores are adjusted as well.

Cheers,

Tony
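
A minimal sketch of the back-up Tony describes: a stored book tree whose leaves hold engine scores, with results propagated to the root negamax-style. The node layout, the scoring convention (scores from the side to move) and the example moves and numbers are illustrative assumptions, not taken from any particular book builder.

# Sketch of minimaxing saved analysis back through a book tree.
# Assumption: every score is from the point of view of the side to move
# (negamax convention), so a child's score is negated at its parent.
MATE = 100_000

class BookNode:
    def __init__(self, leaf_score=0):
        self.leaf_score = leaf_score  # e.g. the stored 10-second engine evaluation
        self.children = {}            # move (str) -> BookNode

    def backed_up_score(self):
        # Leaf: just the stored engine evaluation.
        if not self.children:
            return self.leaf_score
        # Interior node: best reply for the side to move. A mate found anywhere
        # below overrides the shallow leaf scores on the whole path to the root.
        return max(-child.backed_up_score() for child in self.children.values())

# Tiny usage example (moves and numbers are made up):
root = BookNode()
root.children["e4"] = BookNode(leaf_score=-20)  # after 1.e4: the side to move is slightly worse
root.children["g4"] = BookNode(leaf_score=-50)  # stale shallow eval: 1.g4 looked even better
print(root.backed_up_score())                   # 50 -> the book would pick 1.g4

# A later, deeper search adds a hypothetical refutation below 1.g4:
root.children["g4"].children["e5"] = BookNode(leaf_score=-MATE)  # side to move is getting mated
print(root.backed_up_score())                   # 20 -> the mate is minimaxed back; the book switches to 1.e4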
mrglsmrc
Member
Posts: 3
Joined: Thu Dec 11, 2008 3:05 am
Location: gmt-5, metro nyc

distributed computing and chess

Post by mrglsmrc »

Dylan Sharp wrote: Huh, you put 100 CPUs to do the job and do it in one day?
100 CPUs working together do not give 100 times the output of 1 CPU. It is less than that, because the more CPUs you have running, the more coordination overhead there is to manage.
It's been a long time since I looked at the issue, but I'm pretty sure there comes a point where adding more CPUs does not actually improve output.
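
One way to put a number on that diminishing-returns point is Amdahl's law; the 5% serial/coordination fraction in this sketch is an assumed figure, chosen only for illustration.

# Illustrative Amdahl's-law speedup: a fixed fraction of the work
# (coordination, merging results, shared-tree access) cannot be parallelized.
def amdahl_speedup(n_cpus, serial_fraction=0.05):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_cpus)

for n in (1, 10, 100, 1000):
    print(n, round(amdahl_speedup(n), 1))
# 1 -> 1.0, 10 -> 6.9, 100 -> 16.8, 1000 -> 19.6  (capped near 1/0.05 = 20x)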
turbojuice1122
Senior Member
Posts: 2315
Joined: Thu Aug 23, 2007 9:11 pm

Post by turbojuice1122 »

Generally, 100 CPUs working together, without good management, are going to be 10 times more efficient than 1 CPU. :-)