Researchers Publish Findings on Potential New Treatments for Leishmaniasis
Source:  World Community Grid News and Updates
jeudi 8 septembre 2016 19:25

The Drug Search for Leishmaniasis team recently published their findings in the Journal of Computer-Aided Molecular Design. Using World Community Grid's computing power, they have identified several drug compounds which may lead to improved treatments for this neglected and sometimes deadly disease.


Behind the scenes at Berkeley SETI: Part II
Source:  SETI@Home
mardi 6 septembre 2016 22:59

In our latest video, Berkeley SETI Research Center Engineer Dave MacMahon takes us into the server room and shows us some of the equipment that powers the search: https://youtu.be/WcTcQIVrskw

Watch Part I of our interview with Dave at https://youtu.be/IOJ6-_gIyP0


Removing the restriction on running work units
Source:  vLHCathome
mardi 6 septembre 2016 15:13

After experimenting with a number of options, a solution has been found: a default limit on the number of running work units per host for new volunteers, which can be disabled by those who are more experienced.

Once implemented, a new setting will be available in the project preferences where the Max # jobs can be set. In order to provide this, three steps are required.

  • Update the project to support the option
  • Update everyone's project preferences to set this value to 2 tasks
  • Remove the current project limit



This should be transparent for everyone, and only those who wish to receive more tasks will get them once they have changed the value of this new setting. Work on this will start immediately and should be completed within the next few days.
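For volunteers who would rather manage this on their own machine, a similar cap can be approximated client-side. The sketch below is only an illustration, assuming a BOINC client recent enough to support the standard project_max_concurrent option in an app_config.xml file placed in this project's folder under the BOINC data directory's projects/ directory; the value 2 simply mirrors the default mentioned above.

<app_config>
  <!-- run at most 2 of this project's tasks at a time on this host -->
  <project_max_concurrent>2</project_max_concurrent>
</app_config>

After saving the file, have the client re-read its configuration files (or restart BOINC) for the limit to take effect.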




Attributing European Record temperatures of 2014
Source:  Climateprediction.net
mardi 6 septembre 2016 11:30

2014 broke the record for the warmest yearly average temperature in Europe. Attributing how much this was due to anthropogenic (man-made) climate change and how much it was due to natural variability is a challenging question, but one that is important to address. A new study comparing four different attribution methods, including weather@home, indicates that each shows a very strong human influence on the European temperatures. However, the extent of this influence depends on a researcher’s definition of the event and the method used.

The resulting paper[1], led by Peter Uhe of the University of Oxford’s Environmental Change Institute and e-Research Centre, in collaboration with researchers at the Royal Netherlands Meteorological Institute, the University of Melbourne and Climate Central, compares four event attribution methods to determine how much the record temperatures experienced in Europe in 2014 were due to climate change. The paper, ‘Comparison of Methods: Attributing the 2014 record European temperatures to human influences’, has recently been published in Geophysical Research Letters.

Using the 2014 temperatures as an example, the paper will form a foundation for future attribution studies of this type. It was conducted as part of the World Weather Attribution project run by Climate Central to accelerate the scientific community’s ability to analyse and communicate the possible influence of climate change on extreme-weather events such as storms, floods, heat waves and droughts.

Attributing increases in temperature

An increase in temperature puts strain on society, ecosystems, and infrastructure. So it is important to be able to determine to what extent, if any, man-made climate change has contributed to the frequency of temperature records being broken. Quantifying the impacts in this way helps us to begin assessing the socioeconomic costs of climate change.

The paper notes that record-breaking daily temperatures over Europe have increased in recent decades due to climate change – and record-breaking annual average temperatures, such as those experienced in 2014, are also expected to increase. Taken as a whole, these increasing temperatures have robustly been attributed to increasing anthropogenic greenhouse gas emissions; however, attributing individual events to climate change is less straightforward.

When studying individual events there are a number of factors that can alter the findings, such as the assumptions made and the type of model used. Employing multiple methodologies provides a consistency check between the results of each method and leads to greater confidence in attributing human influence. The study contrasts the results of four methodologies, including weather@home, when applied to the question: how much did anthropogenic climate change alter the likelihood of the European record temperatures of 2014? Each of the methods takes a different approach and involves slightly different assumptions regarding the modelling of anthropogenic influence on climate.

The results

The study examines the ‘risk ratio’, which is defined as the ratio between the probability of the event occurring in the actual world with climate change and the probability of it occurring in the world without human influence. (For example, a risk ratio of 10 means the event is 10 times more likely in the current climate compared to a world without climate change.) The study found that attribution over larger geographical areas tends to give greater risk ratio values. This highlights a major source of sensitivity in attribution statements and the need to define such events for analysis on a case-by-case basis.
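Written out in the same terms, the definition amounts to:

risk ratio = P(event in the actual climate, with human influence) / P(event in a ‘natural’ climate, without human influence)

For an annual-scale event like this, each probability is roughly the inverse of the corresponding return period, so the same ratio can also be read as the natural-world return period divided by the current-climate return period; this is how the return-period figures quoted below relate to the risk ratios.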

[Figure 2: Risk ratio calculated using the four different methods.]

All of the methods showed a risk ratio of at least 500 for temperatures averaged over the whole of Europe – in other words, the extreme temperatures were at least 500 times more likely in the current climate compared to a world without climate change. The climate modelling techniques representing the world without human influence did not find any simulations as warm as observed in 2014. Statistical modelling based on the observations alone suggested that the gap between two such events in a ‘natural’ world would likely be at least 4000 years, compared to around every 30 years in the current climate.

However, as the impacts of most events are felt on a local rather than continental scale, a more meaningful way of looking at the risk ratio is to calculate it for small regions. For example, 2014 was especially warm in central Europe and Scandinavia but cooler in Portugal, Ukraine and the western part of Russia. Using an empirical model, the risk ratio varied from less than 10 in Scandinavia, to between 10 and 100 in Europe, to more than 1,000 in Spain and France. The study showed that calculating the risk ratio locally gives a very different value compared to calculating it from the average European temperature.

The researchers found that the choice of region for analysis is particularly important, to avoid including effects that are not relevant for a particular study. Using the multi-model approach, the researchers can say with high confidence that the annual mean European temperatures have been made at least 500 times more likely by human-induced climate change – but this is only true for the continent as a whole, and not for individual regions. For an event defined by mean temperature, the exact choice of region can change the quantification of the risk ratio by an order of magnitude.

Resources:

[1] Uhe, P., F. E. L. Otto, K. Haustein, G. J. van Oldenborgh, A. D. King, D. C. H. Wallom, M. R. Allen, and H. Cullen (2016), Comparison of methods: Attributing the 2014 record European temperatures to human influences, Geophys. Res. Lett., 43, doi:10.1002/2016GL069568.

World Weather Attribution Analysis: https://wwa.climatecentral.org/analyses/2014-europe-heat/




Tips on how to gain better performance on the ATLAS_MCORE app
Source:  ATLAS@Home
mardi 6 septembre 2016 03:23

According to our tests, the CPU performance varies with the number of cores of the VM. (We measure CPU performance in seconds of CPU time per event. For example, the current ATLAS job running on the ATLAS_MCORE app processes 100 events; if the overall CPU time for this job is 30000 seconds, then the CPU performance is 300 seconds/event.)
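In other words, the metric works out as:

CPU performance (seconds/event) = overall CPU time of the job (seconds) / number of events processed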

In the test, we also compared the CPU performance on different Virtualbox versions.

The test result can be seen here:



The above test was done on 2 hosts: HOST1 has HT (Hyper-Threading) enabled, while HOST2 has HT disabled. The test result is consistent regardless of whether HT is enabled or disabled on the CPU.

We also derived a result from the ATLAS job statistics, based on jobs from a period of over 1 month. The following result shows the average CPU performance for different numbers of cores. (ATLAS_MCORE supports up to 12 cores for now.)



The benefit of using more cores in one VM is lower memory usage, but using a large number of cores can also significantly reduce the CPU performance. Our test results suggest this is the case on all cloud computing platforms, not just on ATLAS@Home.

To get a good tradeoff between memory usage and CPU performance, we advise that you configure the number of cores for each ATLAS_MCORE job (VM) according to the overall cores and memory allocated to BOINC. For example, if your host allocates 12 cores to BOINC, ATLAS_MCORE by default creates one VM with 12 cores and 12.1 GB of memory; but if the host has enough memory, you can customize the usage with the app_config.xml file, e.g. each VM uses 6 cores with 7.3 GB of memory, so that your host runs 2 VMs and the overall memory usage is 14.6 GB.

You can limit the MultiCore app by using app_config.xml (this file needs to be placed in your projects/atlasathome.cern.ch/ directory).

Below is an example to limit each ATLAS_MCORE job to use only 6 Cores:

<app_config>
  <app_version>
    <app_name>ATLAS_MCORE</app_name>
    <avg_ncpus>6.000000</avg_ncpus>
    <plan_class>vbox_64_mt_mcore</plan_class>
    <cmdline>--memory_size_mb 7300</cmdline>
  </app_version>
</app_config>


You should change these two lines to your needs; for example, for a 4-core VM (with the memory set per the formula below):

<avg_ncpus>4.000000</avg_ncpus>
<cmdline>--memory_size_mb 5700</cmdline>


The memory usage calculated by the ATLAS_MCORE app follows this formula:

memory (MB) = 2500 + (800 * NumberOfCores)

so it is 7300 MB for 6 cores, and 5700 MB for 4 cores.
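As a cross-check with the default configuration mentioned above: 2500 + (800 * 12) = 12100 MB for the 12-core VM, i.e. the 12.1 GB default, and two 6-core VMs at 7300 MB each add up to the 14.6 GB overall usage.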

We will also make some changes on the server side very soon:
1. Require a minimum version (5.0.0) of VirtualBox for the ATLAS_MCORE app.
2. Limit the ATLAS_MCORE app to use at most 8 cores.


Project News Sep 4, 2016
Source:  Rosetta@home
dimanche 4 septembre 2016 09:00

We've had an unplanned project crash. The project is back online but you might experience interruptions while the system catches up.... -KEL [Sun Sep 4 07:53:59 PDT 2016]


Volunteer from France Creates an International Team
Source:  World Community Grid News and Updates
vendredi 2 septembre 2016 18:26

When Thomas Dupont discovered World Community Grid, he decided not only to contribute his own computing power, but also to create a team of volunteers that welcomes members from all over the globe. Here is his story in his own words.


ecm xyyxf finally achieved t50
Source:  yoyo@home
vendredi 2 septembre 2016 00:00

The xyyxf project has finally achieved t50 (B1=43e6) on all composites. We found some prime factors or fully factored some composites. Now the project has started to enqueue numbers to partial/full t55 (B1=11e7). Most of these numbers will be ready for SNFS once we complete them. A few will likely complete ECM as soon as this weekend, with a steady stream of NFS ready composites expected thereafter. They will go to NFS@home to get them fully factored.


Server replaced
Source:  yoyo@home
vendredi 2 septembre 2016 00:00

Last month we had 2 crashes of the server. The server just powered off. We saw no software reason for it. Some capacitors looked strange. So we placed the disks into a new server.


Performance issues with older Virtualbox
Source:  ATLAS@Home
jeudi 1 septembre 2016 10:46

We recently noticed that with the ATLAS Multi Core application, there is a HUGE performance difference between old and new versions of VirtualBox, and this is especially obvious with big multi-core VMs (core number > 4).


As shown on this host with 12 cores allocated to BOINC, the older version of VirtualBox (4.2.16) uses almost 3 times the CPU time of the newer version (5.1.2) on the same kind of jobs. More results from other hosts in our database confirm this difference. Our database also shows that around 7% of hosts still use an older VirtualBox (version < 5.0.0).

Also, in a few days, we shall set up a requirement on the server so that hosts with an older VirtualBox (version < 5.0.0) will no longer be sent ATLAS_MCORE tasks.

Test results can be seen here