
8 Policy Issues that Every Physicist Should Follow October 5, 2012

Posted by admin in : Astronomy and Astrophysics (ASTRO), Atomic, Molecular and Optical Physics (AMO), Chemical and Biological Physics (CBP), Condensed Matter and Materials Physics (CMMP), Earth and Planetary Systems Sciences (EPSS), History, Policy and Education (HPE), Medical Physics (MED), Nuclear and Particle Physics (NPP), Photonics and Optics (POP), Physics Education Research (PER), Technology Transfer, Business Development and Entrepreneurism (TBE)

#1. Federal Science Budget and Sequestration
The issue of funding for science is always with us.  With few exceptions, everyone seems to agree that investment in science, technology and innovation is fundamentally necessary for America’s national and economic security.  Successive Administrations and Congresses have rhetorically praised science, and have declared that federal science agencies, particularly NSF, the DOE Office of Science and NIH, should see their respective budgets doubled.  But where the rhetoric has met with action in the last decade, recent flat-lined budgets and the projections for the next decade erode those increases in real terms; in fact, in the next few years the federal R&D budget could regress to 2002 levels and, in several cases, to historic lows in terms of real spending power.

What is sequestration?
Last year Congress passed the Budget Control Act with the goal of cutting federal spending by $1.2T over 10 years, relative to the Congressional Budget Office’s 2010 baseline.  The broad policy issues in the Budget Control Act follow from the fact that the total amount and the rate of growth of the federal public debt are on an unsustainable path.  The Budget Control Act would only reduce the rate of growth, not the debt itself.  The basic choices are to increase taxes and/or to decrease spending.

The Budget Control Act also established the Joint Select Committee on Deficit Reduction, which was to produce a plan to reach the goal.  If the committee did not agree on a plan, the legislation provided for large, automatic, across-the-board cuts to federal spending starting in January 2013 (already one quarter into FY13).  This is called sequestration.  The committee could not come to an agreement, and as a result the federal government faces what has been termed a ‘fiscal cliff’: several tax provisions will expire (resulting in tax increases) at the same time the sharp spending cuts take effect.  This would almost certainly plunge the economy into a recession.

Sequestration would require budget cuts of at least 8% immediately in FY13 (the current year).  In the political lexicon on this topic, federal spending is divided into defense and non-defense.  The current formula would put slightly more of the cuts on non-defense programs, but there is talk of putting the entire burden of sequestration on non-defense programs.  If the burden is borne only by non-defense programs, some agencies could lose as much as 17%.

It is important to emphasize that these would be immediate cuts starting with FY13 budgets, so a $100K grant for this year would suddenly become $92K, or possibly $83K.  Then from the sequestration budgets, the Budget Control Act would require flat budgets for the subsequent 5 years.  While it would generally be up to the agencies to figure out how to distribute the immediate cuts, it is instructive to see how the cuts would impact agencies that are important overall to physics and astronomy research.
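The arithmetic behind those numbers is simple, and a short sketch makes it concrete.  This is an illustrative calculation only (the $100K grant is a hypothetical figure from the paragraph above, not an official budget tool):

```python
# Sequestration arithmetic: an immediate across-the-board cut of 8%,
# or as much as 17% if the burden falls entirely on non-defense programs.

def after_sequester(budget: float, cut_rate: float) -> float:
    """Return the budget remaining after an across-the-board cut."""
    return budget * (1.0 - cut_rate)

grant = 100_000  # a hypothetical $100K FY13 grant

print(round(after_sequester(grant, 0.08)))  # 8% cut  -> 92000
print(round(after_sequester(grant, 0.17)))  # 17% cut -> 83000
```

The same two rates reproduce the $92K and $83K figures quoted above; applied to whole agency budgets they reproduce the agency-level cuts discussed below.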

How does it impact physics?
The R&D Budget and Policy Program at AAAS has done a masterful job of analyzing sequestration and its impact on science agencies.  The cases of DOD and NIH provide some general indications of the effects of sequestration.  DOD is the single largest supporter of R&D amongst the federal agencies, and NIH is the second largest.  Under sequestration they would lose $7B and $2.5B, respectively.  Inside the DOD number is funding for basic and applied science, including DARPA programs; these accounts would lose a combined $1.5B.  But there is an important dichotomy between DOD and NIH: if the Congress and Administration decide to apply the cuts only to non-defense programs, the cuts at NIH would have to be deeper (to meet the overall targets), while the cuts at DOD would remain unchanged.

At NSF, if the cuts are applied truly across the board, $500M would immediately be eliminated from the agency’s FY13 budget.  In a scenario where the cuts are applied only to non-defense spending, the NSF cuts could be just over $1B.  It would be as if the NSF budget had regressed to 2002 levels, basically wiping out a decade of growth.  To put these cuts into further context, NSF’s total FY13 budget request for research and related activities is $5.7B, including $1.345B for the entire Math and Physical Sciences Directorate.  One billion dollars is what the agency spends on major equipment and facilities construction and on education and human resources combined, and it is far larger than the Faculty Early Career Development and Graduate Research Fellowship programs.  Put one last way, the cuts would mean at least 2,500 fewer grants awarded.

Under the sequestration scenario where defense and non-defense programs bear the cuts equally, the DOE Office of Science could lose $362M immediately in FY13, while NNSA, which funds the Lawrence Livermore, Los Alamos, and Sandia national labs, would lose at least $300M.  Again, these cuts would be deeper if Congress votes, and the President agrees, to apply the cuts only to non-defense programs.  The Office of Science cut is nearly equivalent to the requested FY13 budget for fusion energy research ($398M).  The Office of Science has enjoyed a fair level of support in the past decade, but sequestration would take the agency back to FY08 spending levels, or to FY00 levels if the cuts are applied to non-defense programs only.

NASA would immediately lose at least $763M, with the Science Directorate losing nearly $250M.  Again, these cuts would be much deeper if distributed only to non-defense programs; in that scenario NASA would immediately lose $1.7B in FY13, more than the FY13 budget for the James Webb Space Telescope ($627M) or the Astrophysics Division ($659M).

What should you do?
In summary, the overall objective of the Budget Control Act is to reduce the federal deficit by $1.2T over the next decade, which would slow the rate of increase of the overall federal debt.  The Act was the resolution of political gamesmanship over raising the debt ceiling, which has to be increased from time to time to authorize the federal government to make outlays encumbered in part by prior-year obligations.  The sticky issue was taxes.  The GOP, which generally desires more spending cuts than the Democrats, was not willing to agree to anything that involved a tax increase.

Besides wanting to preserve more investments in discretionary programs, President Obama was not willing to push too hard on increasing taxes given the weak economy, probably also wanting to avoid the adverse politics of raising taxes before the election.  Because the Congress could not agree on a way to produce $1.2T in deficit reduction over 10 years, the law requires sequestration of FY13 budgets, i.e., immediate and draconian cuts (8-17%), the mechanics of which would have serious adverse effects on the entire US economy.

Both before the election and after, you should contact the President and your Senators and Representative, and urge them to act urgently to steer the federal government away from sequestration and the fiscal cliff.

#2. Timeliness of Appropriations
What is the issue?
The US Constitution requires that “No money shall be drawn from the treasury, but in consequence of appropriations made by law.”  Each year the federal budget process begins in early February, when the President sends the Administration’s budget request to Congress.  In a two-step process, Congress authorizes programs and top-line budgets, then specifically appropriates spending authority to the Administration for those programs.  The federal fiscal year begins on October 1st, and when Congress does not complete its two-step process, operations of the federal government are held in limbo: essentially, the government is not authorized to spend money.  This is overcome by passing “continuing resolutions” that continue the government’s programs at the prior year’s programmatic and obligating authorities.

How does it affect physics?
Continuing resolutions wreak havoc for the Administration, i.e., for the funding agencies, and consequently for federal science programs.  They prevent new programs from coming online and planned shutdowns of old programs from proceeding.  Because federal program directors cannot know what their final obligating authority will ultimately be, they have to be very careful with how much they spend; the consequences of over-spending obligating authority are unpleasant.  Keeping a science program going under the uncertainty of a continuing resolution is hard, and in some cases impossible.

What should you do?
Physicists would be well advised to tune into the status of appropriations for agencies from which they get funding, plan accordingly, and use their voices to pressure Congress to finish the appropriations process by October 1st.

#3. Availability of Critical Materials: Helium, Mo-99 and Minerals
Helium shortage?
Helium is an inordinately important substance not only in physics research but also in several other industrial and consumer marketplaces.  Yet despite its cosmic abundance, helium is difficult to capture and make usable at a reasonable cost.  Usable helium supplies are actually dwindling at a troubling rate, and price fluctuations are having very undesirable effects on scientific research and other sectors.

Most usable helium is produced as a by-product of natural gas production.  Gas fields in the United States have a higher concentration of helium than those found in other countries.  Because of those facts, combined with decades of recognition of helium’s value to military and space operations, scientific research and industrial processes, Congress enacted legislation to create the Federal Helium Program, which maintains the largest reserve of available helium in the world.

Enter the policy issues.  In an effort to downsize the government in 1996, Congress enacted legislation to eliminate the helium reserve by 2015 and to privatize helium production.  But the pricing structure required by the 1996 legislation led to price suppression, and thus private companies have been slow to enter the industry as producers, even as demand has been steadily increasing.  So with the federal government’s looming exit from helium production, there does not seem to be another entity with the capacity to meet the growing demand for helium at a reasonable price.  The few other sources of usable helium in other countries have nowhere near the US government’s production capacity.

To address this problem, Senator Bingaman of New Mexico introduced the Helium Stewardship Act of 2012, a bipartisan bill sponsored by two Democratic and two Republican Senators.  This legislation would authorize operation of the Federal Helium Program beyond 2015 and maintain a roughly 15-year supply for federal users, including the holders of research grants, guaranteeing them helium until about 2030.  It would also set conditions for private corporations to more easily enter the helium production business.

But since no action was taken in this Congress, it will have to be reintroduced in January 2013 when the new Congress convenes, and it will have to be taken up in the House after being passed in the Senate.

[Update] On March 20, 2013 the House Natural Resources Committee unanimously approved legislation that would significantly reform how one-half of the nation’s domestic helium supply is managed and sold. H.R. 527, the Responsible Helium Administration and Stewardship Act would maintain the reserve’s operation, require semi-annual helium auctions, and provide access to pipeline infrastructure for pre-approved bidders, in addition to other provisions on matters such as refining and minimum pricing. The bill now moves to the House floor. On the Senate side, Senators Wyden and Murkowski have released a draft of their legislation addressing this issue.

Mo-99 is in short supply too.
There are other critical materials for which Congressional action is pending.  Molybdenum-99 is used to produce technetium-99m, which is used in 30 million medical imaging procedures every year.  But the global supply of molybdenum-99 is not keeping up with the global demand.  There are no production facilities located in the United States, but legislation pending in Congress would authorize funding to establish a DOE program that supports industry and universities in the domestic production of Mo-99 using low enriched uranium.  Highly enriched uranium is exported from the US to support medical isotope production, but this is considered to be a grave global security risk.  The legislation would prohibit exports of highly enriched uranium.

Again this legislation passed the Senate in the last Congress but was not taken up in the House.  It will have to be reintroduced in the next Congress, which convenes in January 2013.  But a technical solution announced by scientists in Canada and another by a team from Los Alamos, Brookhaven and Oak Ridge national laboratories may change the landscape for this particular problem.

Another piece of legislation, the Critical Minerals Policy Act, sought to revitalize the US supply chain of so-called critical minerals, including rare earth elements, cobalt, thorium and several others.  It was opposed by several environmental groups, and the economics of some mineral markets are now attracting private investment in American sources.

What should you do?
Urge the Senators and Representatives on the relevant committees to reintroduce the Helium Stewardship Act and the Critical Minerals Policy Act, as well as legislation that authorizes and appropriates funding for Mo-99 production in the US.

#4. K-12 Education: Common Core Standards and the Next Generation Science Standards
What are the Common Core Standards Initiative and the Next Generation Science Standards?
In 2009, 49 states and territories elected to join the Common Core Standards Initiative, a state-led effort to establish a shared set of clear educational standards for English language arts and mathematics.  The initiative is led jointly by the Council of Chief State School Officers and the National Governors Association.  In 2012 the ‘Common Core’ standards were augmented with the Next Generation Science Standards.

How does this affect physics?
The National Research Council released A Framework for K-12 Science Education, which focused on integrating the science and engineering practices, crosscutting concepts, and disciplinary core ideas that together constitute rigorous scientific literacy for all students.  The NGSS were developed with this framework in mind.  The goal of the NGSS is to produce students with the capacity to discuss and think critically about science-related issues as well as be well prepared for college-level science courses.

Setting and adopting the Common Core and NGSS are not federal matters; the federal government has a very small footprint in the overall initiative.  Rather, the policy action on adopting these standards will occur at the state, school district, and maybe even individual school levels.

What should you do?
Physicists in particular should collaborate with K-12 teachers and help, where appropriate, to implement the curriculum strategies that best position students for STEM careers.  Physicist-teacher collaborations are also very necessary to ensure that the content of physical science courses covers the fundamentals but also incorporates the forefront of scientific knowledge.

#5. State Funding for Education
National Science Board signals the problem
The National Science Board, the oversight body of the National Science Foundation, recently released a report on the declining support for public universities by the various governors and state legislatures.  According to the report, state support for public research universities fell 20 percent between 2002 and 2010, after accounting for inflation and an increased enrollment of about 320,000 students nationally.  In the state of Colorado, the home of JILA, state support for public universities fell 30 percent over the same period.

Public research universities perform the majority of the academic science and engineering research funded by the federal government, and they train and educate a disproportionate share of science students.  Yet government financial support for public universities has actually been eroding for decades.

The issue is not so much the movement of the best students and faculty from public institutions to private institutions.  All institutions of higher education are federally tax-exempt organizations, so in some sense they all are public institutions.  Rather, the issue is support for the infrastructure that underpins innovation, economic prosperity, national security, rational thought, liberty and freedom.

How does this impact physics?
In physics we have seen the effects of declining support for higher education in Texas, Rhode Island, Tennessee and Florida, where physics programs were closed.  In other states, budget-driven realities have meant physics departments being subsumed by large math or chemistry departments.

What should you do?
Public and private universities will have to find efficiencies and yield to greater scrutiny as they always have.  But physicists will have to stand up and remind their state governors and legislators of their value to institutions of higher education in terms of educating a science-literate populace as well as producing new knowledge and knowledge workers needed for innovation and economic growth.

#6. College Student Enrollment and Retention
Earlier this year the President’s Council of Advisors on Science and Technology (PCAST) released a report entitled Engage to Excel: Producing One Million Additional College Graduates with Degrees in Science, Technology, Engineering and Mathematics.

Economic projections point to a need for approximately 1 million more STEM professionals over the next decade than the U.S. will produce at the current rate, if the country is to retain its historical preeminence in science and technology.  To meet this goal, the United States will need to increase the number of students who receive undergraduate STEM degrees by about 34% annually over current rates.  Currently the United States graduates about 300,000 bachelor’s and associate degrees in STEM fields annually.

The problem is low retention rates for STEM students
Fewer than 40% of students who enter college intending to major in a STEM field complete a STEM degree.  Increasing the retention of STEM majors from 40% to 50% would, by itself, generate three quarters of the targeted 1 million additional STEM degrees over the next decade.  The PCAST report focuses heavily on retention.  It proposes five “overarching recommendations to transform undergraduate STEM education during the transition from high school to college” and during the first two undergraduate years: (1) catalyze widespread adoption of empirically validated teaching practices; (2) advocate and provide support for replacing standard laboratory courses with discovery-based research courses; (3) launch a national experiment in postsecondary mathematics education to address the mathematics preparation gap; (4) encourage partnerships among stakeholders to diversify pathways to STEM careers; and (5) create a Presidential Council on STEM Education, with leadership from the academic and business communities, to provide strategic leadership for transformative and sustainable change in STEM undergraduate education.
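The retention figure and the “three quarters of the million” claim are consistent with each other, as a back-of-the-envelope sketch shows (the ~300,000 degrees per year and ~40% completion rate are the figures quoted in this section; everything else is derived):

```python
# Back-of-the-envelope check of the PCAST retention arithmetic.

current_grads = 300_000      # STEM degrees awarded per year (quoted above)
current_retention = 0.40     # fraction of intending STEM majors who finish

# Implied number of students entering college each year intending a STEM major
entrants = current_grads / current_retention   # ~750,000 per year

# Extra graduates per year if retention rose from 40% to 50%
extra_per_year = entrants * (0.50 - current_retention)

extra_per_decade = extra_per_year * 10
print(round(extra_per_decade))   # 750000, i.e. 3/4 of the 1M-degree target
```

So a ten-point improvement in retention alone closes most of the gap, which is why the report leans so hard on the first two undergraduate years.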

How is physics impacted?
The New Physics Faculty Workshops put on by APS and AAPT were mentioned in the report for changing participants’ teaching methods and having had positive effects on student achievement and engagement.  The report also explicitly calls for NSF to create a “STEM Institutional Transformation Awards” competitive grants program.  But the delegation that met with the Texas Board of Higher Education was confronted with student retention data in physics compared to other STEM fields.

This all ties together with federal budgets for STEM education and research, and with the issue of state support for public education.  The lesson from Texas in particular is that physics must do a better job of retaining students in the major or face relative extinction in academe.

What should you do?
PCAST would say: engage your students to excel.  Everyone involved in physics instruction should continually assess their teaching methods and student outcomes.  Everything from the textbooks and labs used to the social environment of the department should be on the table for improvement.

#7. Attacks on Political Science and Other Social Sciences
When science is politicized, caricatured and ridiculed we all lose
In May 2012 the US House of Representatives voted to eliminate the political science program at the National Science Foundation.  The effort was spearheaded by Arizona Republican Jeff Flake.

Congressman, now Senator, Flake was ostensibly concerned about federal spending and wanted to make the point that there are some government programs we must learn to do without.  But the concern for scientists is the approach of singling out individual projects and programs and subjecting them to ridicule based only on their titles.  This rhetorical and political device is used quite a bit, even in biomedical science, and when it is, it diminishes science everywhere.

More recently, Representative Cantor and others have spoken out against funding social science research, specifically targeting political science research by saying that taxpayers should not fund research on “politics”.  It is important to understand the difference between political science and politics.  Political science research produces knowledge that citizens need to enjoy the fullness of freedom; moreover, it is a hedge against tyranny and deception by politicians.

Attacks on NSF funding of the social sciences are not new.  NSF funding for the social sciences was slated to be zeroed out during the Reagan administration.  One result was a spirited defense of the importance of such work by the National Science Board, which appeared in its annual report, provocatively titled “Only One Science.”  The Board was then chaired by Lewis Branscomb, a distinguished physicist, who led the effort to build the case for the social sciences.

Physicists today need to channel Dr. Branscomb and be more learned and active on policy matters.  Particle physics, astronomy and cosmology are not immune from the same kind of attacks being waged against political science.  There are of course many tales of even the most esoteric results of yesterday’s physics research having a profound impact on our economy today.  Generally, politicians seem to judge the utility of a funded research project from its name or perhaps its brief project summary; that in itself tends to ridicule science and scientists in ways that are quite destructive.  So all scientists should advocate for intellectual inquiry and its innate public benefits.  Golden Fleece attacks against science may focus on genetic analysis in Drosophila melanogaster one day and political dynamics in a small foreign country another, but they could target cold atoms in an optical lattice the next.

[UPDATE] On March 20, 2013 the bill to fund the government for the rest of FY13 passed the Senate; it contained an amendment barring NSF from funding political science research unless the director can certify that the research would promote “the national security or economic interests of the United States.”  The House passed the same bill the next day, and President Obama is expected to sign it.  So for the next few months at least, certain political scientists may be frozen out of NSF funding.

The Coburn amendment probably could not have made it through in regular order, i.e., the normal process of budget legislating consisting of the President’s request, Congressional authorization followed by appropriation, and final action by the President.  But in a situation where time becomes a critical element and there is “must-pass” legislation actively under consideration, these things can happen.  This underscores the need for political knowledge and information, as well as vigilant, persistent and nimble activism.

What should you do?

The bill eliminating NSF’s political science program has only passed the House; it was never taken up in the Senate.  But in 2011 Oklahoma Senator Tom Coburn advocated for the elimination of the entire NSF Social, Behavioral and Economic Sciences Directorate.  If either measure were to become law, it would have to be reintroduced in the next Congress.  Physicists should stay abreast of attacks on other intellectual disciplines, because one day those attacks will be directed at physics and astronomy research.

[Update March 27, 2013]  Political scientists suffered a setback in the continuing resolution for FY13.  Both the House and Senate approved an amendment offered by Senator Coburn that bars NSF from awarding any grants in political science unless the director can certify that the research would promote “the national security or economic interests of the United States.”  The political science programs at NSF have a combined budget of $13 million; the legislation requires the NSF director to move any uncertified amount to other programs.  President Barack Obama has signed the legislation.  This kind of action against social science research is not new, but it is the first time in a long while that such a measure has actually become law.

Given the exact wording of the Coburn amendment, it is only valid until September 30, 2013, when the continuing resolution expires.  As a distinct point of lawmaking it may or may not survive the regular order of budgeting, authorizing and appropriating.

#8. Open Access to Research Literature
There is much public concern about access to the output (manifest as journal articles) of publicly funded research.  And scientists worldwide are of course very concerned about rising journal subscription prices.

Last December the Research Works Act (RWA) was introduced in the U.S. Congress.  The bill contained provisions to prohibit open-access mandates for federally funded research and to severely restrict the sharing of scientific data.  Had it passed, it would have gutted the NIH Public Access Policy.  Many scientists considered the RWA antithetical to the principle of openness and free information flow in science.  Perhaps owing to much public outcry, the proposed legislation was abandoned by its original sponsors.

The United Kingdom and the EU have just adopted policies under which all research papers from government-funded research will be open-access to the public.  To support this policy, financing for journals will be sourced from author payments instead of subscriber payments.  This is a major change that will require much transition in marketing, management and finance.

Open-access policy should balance the interests of the public, the practitioners of the scholarly field, and the commercial and professional-association publishers that add value to the process of communicating and archiving research results.  Scholarly publishing is a complex, dynamic and global marketplace, and it is not likely that one solution will satisfy all consumers and producers (which in this marketplace are sometimes one and the same).  New business models, new communication strategies and a realistic understanding of the true demand for scholarly articles will likely be more helpful than precipitous government action.

Members discuss the Higgs discovery July 6, 2012

Posted by admin in : Cosmology, Gravitation, and Relativity (CGR), Nuclear and Particle Physics (NPP)

This is certainly an exciting development at CERN!  My group and I are in the four-lepton working group.  We apply a multivariate analysis to data and simulations to arrive at our results, which agree with and support what was shown by Fabiola.

We see excesses in the gamma-gamma and four-lepton channels, but not in the bbar or WW channels, at least in the 2011 dataset.  It may be just statistics, or the way the data are analyzed in the other channels, but the branching ratios for a Higgs boson are predicted with great confidence (only the mass is a free parameter in the SM).  The WW, bbar and tau channels have large branching fractions, larger than gamma-gamma and four-lepton, but they also have larger backgrounds, making it harder to tease out any excess.  So that may explain this question (in my mind at least it is a bit puzzling).  Also, the gamma-gamma channel can have heavy states contributing to the process, which would signal new physics beyond even the Higgs if this result holds up.  And note that we don’t yet have enough data to determine the intrinsic spin or parity of whatever particle may be attributed to this excess.

What this all means is that, in my opinion, we will need to wait until more data is collected before a definitive statement can be made about a Higgs or not. Now the real work begins.  What is this new particle?  Is it the Standard Model Higgs boson?  Is it one of several new states? Is it a scalar or pseudoscalar?  Etc. Etc.  Very exciting times!

Professor O. Keith Baker, Yale University

Among all the other ideas out there (extra dimensions, branes, etc.), SUSY is the unique one that is brought to the fore by the light-mass Higgs boson that seems just around the corner from a final discovery announcement.  None of the other candidates for what comes after the Higgs discovery have any such implications, to my knowledge.

SUSY is also the only one that is brought to the fore if multiple Higgs bosons are ultimately found.  SUSY actually requires multiple Higgs bosons and their superpartners.

The SUSY extension of the Standard Model that has been most extensively studied is called the ‘Minimal Supersymmetric Standard Model’, or MSSM.  As there are literally scores and scores of undetermined parameters, it is not definitive about the likely mass hierarchy of the Higgs family.  There is also the NMSSM (the ‘next-to-minimal supersymmetric Standard Model’), which has even more parameters and thus is even less definitive about the properties of any Higgs family.  In fact, Superstring/M-Theory suggests that even the NMSSM is not the complete story.

There is a Minimal Supersymmetric Standard Model wiki page that has a pretty good discussion of its properties, including a discussion of why at least a second Higgs boson must exist in the context of the MSSM.  There is also a lecture available on YouTube.

While the MSSM is not so predictive about the masses, it does make very definite predictions about the charges and electroweak couplings of the members of the Higgs family.

Professor S. James Gates, University of Maryland-College Park

As a theorist, I’d say that SUSY has the “nice” property of stabilizing the vacuum, but it also restrains the theoretical hand from just adding anything to the theory arbitrarily.  For example, one might ask where the SM Higgs Lagrangian (without SUSY) comes from; it is just a polynomial interaction and does not seem to be based on a principle like the gauge principle that underlies the other interactions, or SUSY.  The answer is that “well, it works” to give spontaneously broken symmetry, but there may be many theoretical ways to get into this spontaneously broken phase.  Gauge theories and SUSY can control, through symmetry, what the interactions look like, which also forces certain particles to exist for consistency.  That is why SUSY needs more than one Higgs, for example.  This makes SUSY, in some sense (limited to our imaginative ways to use SUSY), easier to rule out if Nature has no need for it.  But even as SUSY starts to manifest itself experimentally (and I believe it will soon), the next big question is “what breaks SUSY?”

Professor Vincent Rodgers, University of Iowa
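The Higgs counting that both contributors allude to can be made explicit. The sketch below is an editorial aside drawn from standard MSSM textbook material, not part of either contributor’s remarks: the superpotential must be holomorphic, so the conjugate of a single Higgs doublet cannot give mass to both up-type and down-type fermions, and a second doublet is also needed to cancel the higgsino gauge anomaly.

```latex
% Two Higgs doublets in the MSSM: H_u (hypercharge +1/2) couples to
% up-type quarks, H_d (hypercharge -1/2) to down-type quarks and leptons.
W \;\supset\; y_u\, Q\!\cdot\!H_u\, \bar{u}
  \;+\; y_d\, Q\!\cdot\!H_d\, \bar{d}
  \;+\; y_e\, L\!\cdot\!H_d\, \bar{e}
  \;+\; \mu\, H_u\!\cdot\!H_d
% Counting: two complex doublets give 8 real scalar fields; 3 are eaten
% by the W^\pm and Z, leaving 5 physical Higgs bosons:
%   h^0, H^0 (CP-even), A^0 (CP-odd), and H^\pm.
```

This counting is the origin of the “four additional Higgs bosons” figure: one SM-like state plus four others.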

Even if this thing is the Higgs, the discovery itself cannot be the only new physics.  The hierarchy problem (the quadratic sensitivity of the Higgs mass to much higher energy scales) requires something that stabilizes it.  The most likely candidate, as far as I can tell, is SUSY, just as Professor Gates wrote.

In SUSY, there are four additional Higgs bosons, at least in the SUSY models I am familiar with.  So Rolf’s statement about “which Higgs” refers to this dilemma: is the particle reported yesterday the SM Higgs, or one of the SUSY Higgs bosons? If the latter, then there should be more waiting to be found.

SUSY is not the only solution to the hierarchy problem, but it is probably the most theoretically developed.  In my opinion, the simplest SUSY models seem to be ruled out over much of their parameter space by precision experiments such as electric dipole moment (EDM) measurements.  Jim may want to clarify this for me.  So if SUSY exists, it is likely realized in one of the MSSM models, or beyond.  Also, as I understand the theory, a heavy Higgs comes into tension with SUSY.  If this 125-126 GeV particle is a Higgs, it lies in a difficult but doable region for some form of the MSSM.

Professor O. Keith Baker, Yale University

The whole event was really thrilling, and I was especially glad to see the payoff from our efforts to enhance and better model the ATLAS detector’s performance under intense luminosity conditions. This demonstrates that we are ready not only for discoveries, but also for the follow-up studies needed to identify this new boson more conclusively.

I concur with our theorists that even if this is a Higgs discovery, our job of explaining how the SM works so well in this energy regime will be far from finished — a lot of my recent work at ATLAS has been related to this area of SUSY and other “exotic model” searches.

But, like Keith, I am especially interested in the couplings of this new particle to third-generation fermions, where the little data ATLAS and CMS have (far too little for me to place bets yet) leave room for plenty of surprises.

The implication for hadron collider physicists of my generation (the ones too late to discover the top quark, who relentlessly probed it at the Tevatron and LHC to check for any deviations from SM predictions) is that a new space has opened up for similar tests. Once again, we can envision promising ways to make significant contributions to our understanding of the particle universe (hurray!).

Professor Ayana Arce, Duke University

The discovery of the Higgs-like particle is the culmination of many years of effort by a great many people. I started on the ATLAS experiment in 1998, contributed to various aspects of it, and held many positions in the ATLAS Collaboration. I was an ATLAS Higgs working group convener in 2008-2010, and in that capacity I led and directed the analysis efforts of the ATLAS Higgs working group. So I can confidently say that my work contributed directly to the search for and discovery of this new particle. It is a significant achievement that will lead to capacity building and training for younger students, an improved understanding of fundamental physics, and ultimately technological spin-offs to the benefit of humanity. It is truly a great pleasure for me to work with so many people across the world and to participate directly in such a monumental discovery, one that may revolutionize our lives in the years to come.

Dr. Ketevi A. Assamagan, Permanent Staff Physicist, Brookhaven National Laboratory

At last, there is exciting and long-awaited news of a new Higgs-like boson. South African scientists, students and computer experts have participated in these exciting developments. “It’s a global experiment, and we have six of our universities participating at CERN,” says Prof Jean Cleymans, leader of the SA-CERN programme, which launched almost four years ago.

The Department of Science and Technology selected CERN as one of its global large-scale infrastructure projects; it supports scientists in the South Africa-CERN consortium to participate in experiments investigating the existence of the Higgs boson and other anticipated discoveries. The Department is proud of these scientists, who are part of this major scientific breakthrough, and celebrates this achievement with the rest of the world.

Tantalizing hints of a new particle with a mass around 126 GeV were reported in December 2011. ATLAS and CMS, two of the CERN experiments, have today not only confirmed these hints with data taken in 2012, but have done so with sufficient confidence (5 sigma each) to claim that a new particle has been observed. A 5 sigma confidence means the probability that the signal is a mere statistical fluctuation is less than 1 in 1.7 million (comparable to rolling a die eight times and getting a six every time). Furthermore, the new particle interacts similarly to the Higgs boson, the particle believed to endow other particles with mass. This new Higgs-like boson will now be subjected to intense and detailed study over some decades, and while exploring it, we may make further surprising discoveries.

Although we don’t have a crystal ball to predict the full benefits to science and society, we note that most of today’s understanding of nature and the development of technology began with the discovery of the now familiar particles like the electron. We are at a new beginning. The LHC may also shed light on the primordial state of matter, shortly after the Big Bang, and on dark matter and dark energy.

The LHC at CERN is a global experiment, and South African participation at CERN enables the highest quality scientific research, manpower development, technology transfer and innovation. The South African computing Grid, a combination of fast networks and high-performance computing clusters, was established as a result of the CERN involvement. It forms the basis of South Africa’s data processing and analysis for CERN, and it will also provide valuable lessons for the SKA and for data-intensive computing in general. Other spinoffs are expected in diagnostic and therapeutic medicine, remote sensing and nuclear technology, to name a few.

Statement by SA-CERN Programme
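The 5 sigma odds quoted in the statement above are easy to verify. A minimal standard-library Python check (an editorial sketch, not part of the SA-CERN statement) computes the two-sided Gaussian tail probability alongside the dice analogy:

```python
import math

def gaussian_tail(n_sigma, two_sided=True):
    """Probability of a pure statistical fluctuation at least
    n_sigma from the mean of a normal distribution."""
    p = 0.5 * math.erfc(n_sigma / math.sqrt(2))
    return 2 * p if two_sided else p

p5 = gaussian_tail(5)
print(f"5 sigma (two-sided): about 1 in {1 / p5:,.0f}")
print(f"eight sixes in a row: 1 in {6**8:,}")
```

Both come out near 1 in 1.7 million, which is why the two figures are quoted together; the one-sided convention would instead give about 1 in 3.5 million.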

Lessons learned (so far) from the superluminal neutrino episode April 7, 2012

Posted by admin in : Astronomy and Astrophysics (ASTRO), Cosmology, Gravitation, and Relativity (CGR), Nuclear and Particle Physics (NPP) , add a comment

Reprinted from Waves and Packets, April 7,2012 edition

With the March 15 paper of the ICARUS group reporting no early-arrival effect for their (seven) neutrino events, the urgency of and interest in this matter seem to be dwindling. OPERA spokesperson Antonio Ereditato and experimental coordinator Dario Autiero have announced their resignations, following a controversial vote of “no confidence” from the collaboration’s other leaders. Waves and Packets has asked three distinguished physicists what they think the lessons learned are from the entire episode.

“It is a misconception that Einstein’s special theory of relativity says that nothing can travel faster than the speed of light. For example, electrons can travel faster than the speed of light in water. This leads to a phenomenon known as Cherenkov radiation, which is seen as a blue glow in nuclear reactors. In addition, it has long been speculated that subatomic particles known as tachyons might exist. Tachyons are theoretically predicted particles that travel faster than the speed of light in a vacuum and are consistent with Einstein’s theory of relativity. For ordinary subluminal particles, light acts as a barrier from above; that is, ordinary matter cannot be accelerated to the speed of light. For superluminal tachyons, light acts as a barrier from below; that is, tachyons cannot be decelerated to the speed of light. It has been conjectured that tachyons could be used to send signals back in time. To date, tachyons have not been observed experimentally.” Ronald Mallett, University of Connecticut-Storrs
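Professor Mallett’s water example can be made quantitative. The sketch below (an editorial addition, assuming a refractive index of about 1.33 for water) computes the Cherenkov threshold, i.e. the minimum speed and kinetic energy an electron needs before it outruns light in the medium:

```python
import math

N_WATER = 1.33      # refractive index of water (assumed value)
M_E_MEV = 0.511     # electron rest energy, MeV

# A charged particle emits Cherenkov light when beta > 1/n, i.e. when it
# moves faster than the phase velocity of light in the medium.
beta = 1 / N_WATER
gamma = 1 / math.sqrt(1 - beta**2)     # Lorentz factor at threshold
kinetic = (gamma - 1) * M_E_MEV        # relativistic kinetic energy

print(f"threshold speed: {beta:.3f} c")
print(f"threshold kinetic energy: {kinetic:.2f} MeV")
```

An electron needs only about a quarter of an MeV of kinetic energy to glow blue in water, which is why beta decays in a reactor produce the effect so readily.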

“I think the first thing the whole episode indicates is that there is still enormous public interest in our field. The need to explore is still keenly felt, so we need to be clear that announcing results, even controversial ones, should be respected by scientists if proper peer review of those results has been performed. It also points out the absolute necessity of following through on external checks. Public review of the scientific process is not a bad thing, nor is showing some humility and skepticism even about ‘sacred’ principles like special relativity. Episodes like this one give us the opportunity to address misconceptions like those surrounding the connection between special relativity and the speed of light. Showing fallibility doesn’t weaken us as long as we remain appropriately demanding of ‘extraordinary proof’ for ‘extraordinary results.’” Larry Gladney, University of Pennsylvania

“I can think of two positive remarks to be made. The first is that, given an information leak from someone familiar with the OPERA experiment to Science magazine, the OPERA Collaboration did the right thing in going public with the information they had at hand. In the spirit of good science, they nearly begged other experiments to validate or invalidate their working hypothesis of superluminal neutrinos. It now appears that invalidation was in order, as reported by the ICARUS experiment. Over the next several months, we may anticipate half a dozen experiments on three continents providing further measurements of neutrino speed; new data will also be forthcoming from the OPERA and ICARUS experiments. My second positive remark is that many of us have been pushed by the OPERA claim to examine the deeper meaning of Special and General Relativity. While paradoxes, such as superluminal travel with inherent negation of cause and effect, are mathematically consistent with Einstein’s equations, they generally are hidden behind horizons, or require invocation of new physics such as negative energy, extra dimensions, sterile neutrinos, etc. It has been fun and educational to think about the possibilities. Any opportunity to explore a guarded secret of Nature must be seized upon. It unfortunately appears now that superluminal neutrino travel may not be one of Her guarded secrets.” Thomas Weiler, Vanderbilt University

What’s your view? Contact Waves and Packets at editors@wavesandpackets.org.

Synchrotron Science on the Move in South Africa February 4, 2012

Posted by International.Chair in : Nuclear and Particle Physics (NPP) , add a comment

By Sekazi K. Mtingwa
MIT and African Laser Centre
Consultant to Brookhaven National Laboratory

Excitement is growing within South Africa’s synchrotron light source user community. That excitement led to a two-day workshop, held December 1-2, 2011, in Pretoria to finalize plans for drafting a strategic plan document to be submitted to the government’s Department of Science and Technology (DST), which is broadly responsible for science and technology in the country, and the National Research Foundation (NRF), which distributes research funding much as the National Science Foundation does in the United States. Top officials from those agencies attended the workshop, including Romilla Maharaj, NRF Executive Director of Human and Institutional Capacity Development; Rakeshnie Ramoutar, NRF Program Director of Strategic Platforms; and Takalani Nemaungani, DST Director of Global Projects. Daniel Adams, Chief Director: Emerging Research Areas & Infrastructure at the DST, provided funding for the workshop, and the South African Institute of Physics (SAIP), which is similar to our American Physical Society, handled the logistics.

The entity that mainly drove the convening of the workshop was the Synchrotron Research Roadmap Implementation Committee (SRRIC), which is chaired by Tshepo Ntsoane from the South African Nuclear Energy Corporation (NECSA) and co-chaired by Wolf-Dieter Schubert from the University of the Western Cape.

Approximately forty scientists attended the meeting, including several from international facilities. Herman Winick of SLAC and Sekazi Mtingwa of MIT attended, and Brookhaven National Laboratory’s Erik Johnson and Ken Evans-Lutterodt joined via teleconference. Johnson and Evans-Lutterodt discussed the pros and cons of South Africa’s inheriting Brookhaven’s second-generation light source, the National Synchrotron Light Source, which is soon to be replaced by NSLS-II. However, the consensus of the workshop was that a new third-generation facility would much better serve national and regional needs. The largest contingent of foreign visitors came from the various European light sources, including José Baruchel, Jürgen Härtwig, and the Laboratory Director General, Francesco Sette, from the European Synchrotron Radiation Facility (ESRF) in Grenoble, France; Jasper Plaisier from Elettra in Trieste, Italy; Trevor Rayment from Diamond in Oxfordshire, UK; and Hermann Franz from PETRA III in Hamburg, Germany. Oxford University’s Angus Kirkland did an outstanding job of facilitating the two-day meeting.

South Africa is relatively new to the international community of synchrotron light source users. Simon Connell, of the University of Johannesburg, has documented the history of South African scientists’ use of synchrotron radiation. The first were Trevor Derry and Jacques Pierre Friederich “Friedel” Sellschop (deceased), both from the University of the Witwatersrand (Wits). In 1994, Derry performed studies of diamond surfaces at both the Synchrotron Radiation Source at Daresbury Laboratory and ESRF. During the same year, Sellschop participated in other diamond studies at ESRF. Then in 1996, Giovanni Hearne, currently at the University of Johannesburg, used the ESRF facility to study materials under extreme pressures. Bryan Doyle, now at the University of Johannesburg, served as a postdoctoral researcher at ESRF around 1999. From those early efforts, the synchrotron light source user community started to grow.

Hearne’s early experiences at ESRF so excited him that, upon returning to South Africa, he wrote a two-page letter to Khotso Mokhele, then President of the Foundation for Research Development (now the National Research Foundation), to share those experiences and to impress upon him that a synchrotron light source is a single key tool that could have wide impact across many scientific disciplines. Moreover, Hearne suggested that a long-term goal should be for South Africa to construct its own light source via a consortium of international partners, especially involving neighboring countries in Southern Africa.

In 2002, at the urging of the Edward Bouchet-Abdus Salam Institute (EBASI), an organization based at the International Centre for Theoretical Physics (ICTP) in Trieste that promotes African and African-American collaborations, the African Laser Centre included the design and construction of a synchrotron light source as a long-term goal in its Strategy and Business Plan. Next, Tony Joel and Gabriel Nothnagel of NECSA co-authored a motivational paper entitled The South African Light Source: Proposal for a Feasibility Study for the Establishment of an African Synchrotron Radiation Facility (2003), followed by Tony Joel’s paper, The South African Synchrotron Initiative: The South African Light Source: A Synchrotron for Africa – Strategic Plan (2004). On another front, in 2004, the DST/NRF/SAIP commissioned an international panel of experts that released the report Shaping the Future of Physics in South Africa, which called for consideration of new flagship projects to complement those in astronomy, such as the South African Large Telescope (SALT) and the Square Kilometre Array (SKA). The panel used a synchrotron light source as a prime example of such a project. Its key members from the U.S. were Ken Evans-Lutterodt, S. James Gates from the University of Maryland-College Park, and Guebre Tessema from the National Science Foundation.

The first organizational structure for a synchrotron science community took shape in 2003, when a committee of synchrotron users established the South African Synchrotron Initiative (SASI). Van Zyl de Villiers of NECSA played a key role in getting DST’s participation in SASI activities. The leadership of SASI mainly consisted of Tony Joel; Simon Connell; Giovanni Hearne; and Lowry Conradie, an accelerator physicist from South Africa’s national accelerator center called iThemba LABS, located just outside of Cape Town. As a result of its participation with SASI, in January 2005, the DST itself assumed a leading role in building the synchrotron science community by forming the Synchrotron Task Team (STT), with Tshepo Seekoe of the DST serving as Chair and Simon Connell leading the development of the science case. It was during this period that the synchrotron science community began to mobilize as a coherent group.

With the assistance of SOLEIL, ESRF and other organizations, the STT organized the first two of a series of roughly biennial Science @ Synchrotrons Conferences (S@S) in November 2005 and February 2007.  Both conferences were extremely successful in developing new projects and sparking the interest of students in synchrotron light source training.  Members of the U.S. physics community, including Herman Winick, Alfred Msezane of Clark Atlanta University, and Sekazi Mtingwa, participated in planning and giving presentations at those conferences, which helped to establish a close partnership between South African synchrotron users and their foreign colleagues, especially the French.  After the second conference in 2007, the synchrotron community further empowered itself with the establishment of SRRIC, which succeeded the STT in championing synchrotron science in South Africa.  The first Chairs of SRRIC were Simon Connell and Giovanni Hearne.  Following the S@S conference in February 2009, Bryan Doyle assumed the Chair, followed by Tshepo Ntsoane.

All the above-mentioned activities culminated in the excitement that birthed the December 2011 Strategic Plan Workshop.  The NRF representatives requested that SRRIC document the outputs of the workshop by March 2012 in the form of a white-paper strategic plan.  The NRF would then study the white paper to determine whether to give the go-ahead for the development of a detailed business plan by June 2012.  Those dates were selected to coincide with the various stages of the government’s budgeting process.  SRRIC appointed a three-person committee to write the strategic plan, consisting of Brian Masara, Executive Officer of SAIP; Douglas Sanyahumbi, Director of the Technology Transfer Office at the University of the Western Cape; and Sekazi Mtingwa, with the latter chairing the committee.

Although the strategic plan has not been completed, there are some overarching comments that can be made. First, there is widespread agreement that the mission of SRRIC going forward will be as follows: To support and facilitate the development and growth of synchrotron science in South Africa in order to ensure that it contributes to excellence in science, innovation and industrial development by exploiting the benefits of synchrotron radiation in advancing fundamental and applied science through

1. Developing human capital, including attracting back the African scientific Diaspora (brain gain) and mitigating any threat of brain drain of young South Africans who have recognized this as a key research tool for their career development;
2. Developing key and/or strategic international collaborations;
3. Ensuring financial support to South Africans whose proposals successfully compete for beam-time at international synchrotron facilities; and
4. Promoting awareness and use of synchrotron science and its capacity to enable the exploration of new frontiers of technology.

In pursuing this mission, the synchrotron science community and the government must undertake a number of key initiatives, including

1. Deciding at what level it should formalize its relationships with foreign light source facilities, especially with ESRF, which is the most heavily used by South African researchers; (Francesco Sette invited South Africa to join ESRF as a Scientific Associate at the 1% level, since its researchers’ utilization of that facility is already approximately at that level.)
2. Studying the feasibility of constructing South African or multinational beam-lines at foreign synchrotron facilities;
3. Promoting significant growth in the number of synchrotron users, with a heavy emphasis on increasing the number of students being trained, for example at the many synchrotron radiation schools offered by international facilities and institutions such as ICTP;
4. Developing programs to preserve and expand the existing technical expertise, such as sending scientists and engineers abroad to join accelerator teams at foreign facilities to expand capabilities in areas such as ultra-high vacuum systems, radiofrequency cavities, magnets, power supplies, and controls;
5. Improving the local, critical feeder infrastructure that allows researchers to prepare and analyze samples before and after they are shipped for studies at foreign synchrotron facilities;
6. Promoting greater involvement of industrial users;
7. Studying the feasibility for constructing a third generation light source;
8. Developing mechanisms to educate the public about the revolutions in science and technology, such as the discovery of new pharmaceuticals, that synchrotrons afford.

The appended figure provides a plot of South Africa’s synchrotron light source usage in terms of the number of users, beam-line shifts, graduate students trained, and visits to synchrotron facilities. The data are a rough approximation based on preliminary surveys; note also that the 2011 data cover only part of the year, since 2011 had not ended by the time of the workshop. According to the data, the number of students trained at foreign facilities increased from six in 2005 to thirteen in 2011, showing growth in human capital, especially over the past three years. The long distances and substantial travel expenses are major factors impeding the increase in the number of students being trained. A local facility would be most advantageous in addressing this need.

Synchrotron Usage in South Africa

Among the workshop presentations, two were especially notable, since they involved applications of synchrotron light source techniques to disciplines of which many are unaware. One involved research in paleontology: Kristian Carlson from Wits discussed his collaboration with Lee Berger, also from Wits, and Paul Tafforeau from ESRF. Among other things, they perform dating and craniodental investigations of Australopithecus sediba, the much-publicized possible human ancestor whose fossil remains Berger’s nine-year-old son, Matthew, discovered in 2008 while assisting his father in field work. In a presentation on light source applications to heritage science, Leon “Jake” Jacobson from the McGregor Museum (Kimberley) discussed his use of light sources to study rock art, that is, ancient paintings on stone. He investigates such issues as the composition of the paints and how their interactions with rock substrates contribute to the art’s conservation. There is increasing worldwide interest in the use of synchrotron radiation in art and archaeology.

Finally, it is notable that Esna du Plessis and Bruce Anderson attended the workshop to represent the oil and gas company Sasol Technology. They reported on their use of synchrotron radiation, specifically extended X-ray absorption fine structure (EXAFS) techniques, for the study of H2, CO and synthesis gas activation of nano-iron. They also made a strong case for a local source to enable more industrial use of light sources.

In conclusion, momentum is building rapidly within the South African synchrotron science community. SRRIC, as its representative, is committed to maintaining, and indeed intensifying, that momentum. On the basis of the strategic plan summarizing the outputs of the December 2011 workshop, SRRIC looks forward to a favorable DST/NRF decision authorizing it to proceed with the development of a detailed Business Plan by June 2012, in order to move synchrotron science in South Africa to the next level of international prominence.

January 30, 2012

This article is also published in the Spring 2012 Newsletter of the Forum on International Physics of the American Physical Society.

Morgan State University Student Spends Summer at CERN July 24, 2011

Posted by admin in : History, Policy and Education (HPE), Nuclear and Particle Physics (NPP) , add a comment
Eric Michael Seabron, a junior physics major and Morgan honor student with a 3.66 grade point average, was selected to join an exclusive 18-member U.S. physics team for a 10-week summer internship at CERN (the European Organization for Nuclear Research) in Geneva, Switzerland.
“This internship is one of the most competitive internships an undergraduate student of physics can compete for in the United States.  Mr. Seabron will benefit from this experience by expanding his knowledge of physics and by participating in the greatest scientific experiment ever proposed, the Large Hadron Collider (LHC). Participation in this internship increases his visibility as an up-and-coming young physicist, and his opportunities for getting into a Tier-1 physics graduate program at schools like Michigan, Harvard, Stanford and Princeton, to name a few,” says Dr. Keith Jackson, chair of Morgan’s physics department.

Mr. Seabron is a member of the University of Michigan’s ATLAS team, sponsored by a National Science Foundation research grant for undergraduates to work on equipment for the ATLAS experiment at the Large Hadron Collider. ATLAS (A Toroidal LHC ApparatuS) is one of the six particle detector experiments constructed at the LHC. He and other student colleagues will assist in the commissioning of the ATLAS EE muon detectors and analyze event data to create R-T calibration curves and muon spectrometer plots.

Since 2009, more than 2900 scientists and engineers from 172 institutions in 37 countries have worked on the ATLAS experiment. 

The ATLAS experiment’s primary objective is to detect particles created in high-energy proton-proton collisions.  ATLAS will allow us to learn about the basic forces that have shaped our Universe since the beginning of time (if time has a beginning) and that will determine its fate. Research at ATLAS will address some of the most basic questions in physics, such as the origin of mass, the possible existence of extra dimensions, the unification of the fundamental forces, and evidence for dark matter candidates in the Universe. ATLAS brings experimental physics into new territory. Most exciting would be a completely unknown surprise: new processes and particles that would change our understanding of energy and matter.

“Students who are successful strive to do more than meet the minimum level of academic performance. If they take this attitude toward their undergraduate education, they will find a plethora of new experiences, challenges and opportunities waiting for them, as Mr. Seabron has,” says Dr. Jackson.


Eric (standing, holding the ladder) with Michigan teammate Kareem Hegazy (on the ladder) in front of 20-ft battery cells.

Aspen Center for Physics – 2011 Summer Program January 27, 2011

Posted by admin in : Astronomy and Astrophysics (ASTRO), Cosmology, Gravitation, and Relativity (CGR), Nuclear and Particle Physics (NPP) , add a comment

The annual physics-astrophysics program at the Aspen Center for Physics will be held from May 22 to September 11, 2011. The Center provides a place for physicists and astrophysicists to work on their research with minimal distraction in a stimulating atmosphere, and in a location of great natural beauty.

Applications are welcome from any physicist or astrophysicist who has a serious program of research to be carried out at the Center. The Aspen Center for Physics is committed to a significant participation of women and under-represented groups in all of the Center’s programs.

Individual Research:

The main Center program is unstructured and concentrates on individual research and the informal exchange of ideas. About 500 physicists and astrophysicists from about 100 institutions participate in the Center’s summer program, with 80-90 in residence at any time. (About 40% of the participants in the 2010 program attended for the first time.) The research interests of the participants cover a number of fields, including astrophysics, biophysics, condensed matter physics, dynamical systems, elementary particle physics, mathematical physics, and statistical physics. The interactions between participants with different interests and backgrounds are one of the most stimulating aspects of the program. Applicants can be sure that colleagues from all subfields of physics will be present throughout the summer.


The Center provides a location where physicists from distant institutions can meet for intensive research collaboration. Small informal collaborations of 2-6 physicists are encouraged, and efforts will be made to accommodate people wishing to work together.


Equally important to the Aspen Summer Program are the informal workshops that serve as focal points on topics of current interest. Workshops are kept very informal, with a strictly limited number of talks, so that participants have ample time for discussion and for initiating new work. The workshops scheduled for summer 2011 are:

Quantum Information in Quantum Gravity and Condensed-Matter Physics May 22 to June 5
Galaxy and Central Black Hole Coevolution: Gravitational-Wave and Multi-Messenger Astronomy May 22 to June 5
Fluctuations and Response in Granular Materials May 22 to June 12
Few- and Many-Body Physics in Cold Quantum Gases Near Resonances June 5 to June 26
Stellar and Intermediate Mass Black Holes: Gravitational Physics and Radiation Sources Across the Universe June 5 to June 26
Computation and Collective Behavior in Biological Systems June 12 to July 3
Year One of the LHC June 26 to July 24
A New Century of Superconductivity: Iron Pnictides and Beyond June 26 to July 24
Holography and Singularities in String Theory and Quantum Gravity July 24 to Aug. 21
New Topological States of Quantum Matter July 24 to Aug. 21
A Theoretical and Experimental Vision for Direct and Indirect Dark Matter Detection Aug. 14 to Sept.11
Flavor Origins Aug. 21 to Sept.11
The Galactic Bulge and Bar Aug. 21 to Sept.11

Cosmology on the Beach! September 1, 2010

Posted by CGR Section Chair in : Astronomy and Astrophysics (ASTRO), Cosmology, Gravitation, and Relativity (CGR), Mathematical and Computational Physics (MCP), Nuclear and Particle Physics (NPP) , add a comment

Applications are now open for the Essential Cosmology for the Next Generation (aka Cosmology on the Beach) winter school/research conference! The organizers strongly encourage a diverse group of advanced graduate students and postdocs to participate. Instructors include NSBP member Edmund Bertschinger of MIT’s Department of Physics. Here is the full announcement:

Essential Cosmology for the Next Generation
(also known as Cosmology on the Beach)

January 10−14, 2011 in Puerto Vallarta, Mexico

The Conference website and Participant Application form are now available at the Berkeley Center for Cosmological Physics website.

This meeting is the third annual edition, following the very successful and popular 2009 and 2010 conferences. It combines a winter school with a research conference: course lectures are blended with plenary talks on recent research advances, with active student and postdoc participation. We encourage a diverse group of advanced graduate students and postdocs interested in attending to apply. The deadline for application is OCTOBER 15, 2010.

Ed Bertschinger, Gravity on Cosmic Scales
Neal Katz, Galaxy Formation
Mark Trodden, Particle Physics, LHC, and Cosmology
Licia Verde, Statistical and Numerical Methods in Cosmology
Martin White, Nonlinear Structure in the Universe

Plenary speakers: to be announced

Organized by the Berkeley Center for Cosmological Physics and Instituto Avanzado de Cosmologia, Mexico.

News From The Front, VII: What is Fundamental, Anyway? July 4, 2009

Posted by CGR Section Chair in : Cosmology, Gravitation, and Relativity (CGR), Nuclear and Particle Physics (NPP) , add a comment

Editor’s note: The following excerpt comes to us from theoretical physicist Clifford Johnson, a professor in the University of Southern California Department of Physics and Astronomy. Professor Johnson’s work primarily focuses on (super)string theory, gravity, gauge theory and M-theory. — CPW

One of the words I dislike most in my field – or more accurately, a common usage thereof – is “fundamental”. This is because it is usually used as a weapon, very often by people in my area of physics (largely concerned with particle physics, high energy physics, origins questions and so forth), to dismiss the work of others as somehow uninteresting or irrelevant. I don’t like this. Never have. Not only is it often allied to a great deal of arrogance and misplaced swagger, it is often just plain short-sighted, since you never know where good ideas and techniques will come from. A glance at the history of physics shows just how much cross-pollination there is between fields in terms of ideas and techniques. You never know for sure where valuable insights into certain kinds of problems may come from.

Fundamental physics is a term I used to hear a lot to refer to particle physics (also called high energy physics much more often these days). This was especially true some years back when I was an undergraduate in the UK; it persisted in graduate school, and it is still in use today, although I think it is declining a bit in favour of less loaded terms. Somehow, a lot of particle physics is regarded as being all about the “what is everything made of at the very smallest scales” sort of question, first discussing atoms, and then atoms being made of electrons surrounding a nucleus, and the nucleus being made of protons and neutrons, and those in turn being made of quarks, and so on, in this way arriving at a list of “fundamental” particles. There’s the parallel discussion about the “fundamental” forces (e.g., electromagnetism and the nuclear forces) being described in terms of exchanges of particles like photons, gluons, and W and Z particles and so forth. There’s no real harm in the use of the term fundamental in this context, but this is about where the word gets elevated beyond its usefulness and starts becoming a hurdle to progress, and then a barrier. Somehow, “fundamental”, meaning “building block”, gets turned, oddly, into “most important”. The issue of what the smallest building blocks are gets elevated to the most important quest, when it is in reality only a component of the story. It is rather like saying that the most important things about the Taj Mahal are the beautiful stones, tiles, and other components from which it is constructed.

Perspectives have evolved a bit since my salad days, with wider recognition of the connection between particle physics, astrophysics, and cosmology. I think that things are (these days) more widely seen as the rich, interconnected, and beautiful landscape of phenomena that they are, but I still find, especially among younger people, that the “building block” attitude is prevalent.

I raise this since sometimes I find that people don’t understand that there are fundamental and vital questions in other areas that connect to so many interesting areas of physics. […]

Read the rest of the article on Asymptotia here.

The Nature of Time March 14, 2009

Posted by CGR Section Chair in : Astronomy and Astrophysics (ASTRO), Cosmology, Gravitation, and Relativity (CGR), Nuclear and Particle Physics (NPP) , add a comment

Arguably one of the greatest and most fundamental problems in cosmology (alright, alright, all of physics) is trying to understand time. What is it? Why does the arrow of time only point in one direction? Because these questions exist and so do physicists, the study of time is an active field of research. It is a multidisciplinary field, with both physicists and philosophers contributing to it. Because the research is esoteric, finding funding for it is sometimes difficult, which is where organizations like FQXi step in.

FQXi is a vaguely controversial organization funded by the Templeton Foundation (but run by very well-respected physicists) that gives money to scientists who do research on fundamental questions in physics. Recently they had an essay contest, and the topic was the nature of time.

The winning essay is by Julian Barbour, a physicist and philosopher in Oxford, UK. The essay jury commended his essay:

The jury panel admired this essay for its crystal-clear and engaging presentation of a problem in classical dynamics, namely to find a measure for duration or the size of a time interval. The paper argues lucidly, and in a historically well-informed manner, that an appropriate choice for such a measure is not to be found in Newton’s pre-existing absolute notion of time, but rather emerges, in the form of ephemeris time, from the observable motions and the assumption of energy conservation. The paper also suggests how this emergence of duration might be relevant to problems in quantum gravity.

All of the winning essays can be found on the FQXi website. You can also read all of the submissions, including the ones that did not receive prizes. I strongly encourage all physicists, from undergrads to professors emeriti, to have a look at the latest in the study of time!

Doing Business with DOE February 10, 2009

Posted by NPP Section Chair in : Acoustics (ACOU), Astronomy and Astrophysics (ASTRO), Atomic, Molecular and Optical Physics (AMO), Chemical and Biological Physics (CBP), Condensed Matter and Materials Physics (CMMP), Cosmology, Gravitation, and Relativity (CGR), Earth and Planetary Systems Sciences (EPSS), Fluid and Plasma Physics (FPP), Mathematical and Computational Physics (MCP), Nuclear and Particle Physics (NPP), Photonics and Optics (POP), Physics Education Research (PER) , add a comment

Are you interested in any of the following?
· Paid undergraduate science research internships?

· Summer research positions for faculty and student teams at a national laboratory?

· Careers with the Federal government or national laboratories?

· Graduate fellowships and Post-Doc appointments?

The Department of Energy is looking for you…

Come see us in the DOE Pavilion

Learn how you can work alongside scientists and engineers who are experienced mentors and want to share scientific knowledge through collaborative research. These programs are for undergraduate students from four-year institutions and community colleges, students preparing to become K-12 science, math, or technology teachers, and undergraduate faculty. Internships are available at all DOE national labs.

Up to 8 qualified undergraduate students will be considered for placement in the summer of 2009. The laboratories also have graduate and post-doc opportunities. We look forward to seeing you in Nashville! Please come join us at Booth 304 and the other booths in the DOE Pavilion in the Exhibit Hall Thursday and Friday or at any of the following activities and workshops:

Physics Diversity Summit: Discussion with Bill Valdez, Director, Office of Workforce Development for Teachers and Scientists

Date: Wednesday, February 11

Time: 2:00 PM

Workshop: Brookhaven National Laboratory: On Using Photons

Date: Thursday, February 12

Time: 2:00 – 3:30 PM and 4:00 – 5:30 PM

Workshop: Oak Ridge National Laboratory: On Using Neutrons

Date: Friday, February 13

Time: 3:00 – 4:30 PM and 5:00 – 6:30 PM

Doing Business with the Department of Energy: Research and Grants

Date: Friday, February 13

Time: 3:00 – 4:30 PM