Framework to Embed Least Squares Solvers

Several available least-squares solvers could be included, among them

• least-squares solvers based on direct search algorithms, e.g., the Simplex algorithm;

• Powell's direction method;

• differential corrections with the Levenberg-Marquardt option;

• damped Gauss-Newton algorithm for constrained least-squares problems; and

• sequential quadratic programming.

The operational use of such a group of solvers would depend on the needs of the analysis. For instance, it could be advantageous to do an initial search with the Simplex algorithm and then switch to the Gauss-Newton algorithm. The switching could be automatic, with control residing in the subroutine DETPAR, which would invoke any specific least-squares solver; a schematic sketch of such an interface follows the output list below. This subroutine would receive the following major input data, which are passed on to the least-squares solvers:

• the number of data points to be considered in the least-squares fit, N;

• the number of adjustable parameters, M;

• the number of constraints, MCON;

• the vector FOBS containing all individual observed data points, e.g., all light curves and all radial velocity curves;

• the vector FWEIGH containing the weights associated with the individual data points. If the weights are lumped weights, they have to be computed before calling DETPAR;

• an initial guess x0 for the parameter vector to be determined. The vector x0 would be mapped onto x by the use of information from a vector KEEP, which would specify whether or not a parameter is adjustable; and

• control parameters specifying accuracy and termination criterion.

The output should consist of the

• vector x containing the parameters;

• vector e(x) containing the statistical errors of the parameters; and

• status flag to interpret the results.
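To make this interface concrete, the following is a minimal sketch, written in Python rather than the Fortran implied by the subroutine names, of how such a DETPAR-like driver might be organized. All names here (detpar, model_f, solver) are illustrative assumptions, not part of any existing code; model_f stands for the model-evaluating subroutine F described below.

```python
import numpy as np

def detpar(n, m, mcon, fobs, fweigh, x0, keep, model_f, solver,
           tol=1e-8, max_iter=200):
    """Hypothetical DETPAR-like driver (all names are illustrative).

    n, m, mcon   -- numbers of data points, adjustable parameters, constraints
    fobs, fweigh -- observed data (light and RV curves) and their weights
    x0, keep     -- full initial parameter vector and flags marking adjustable entries
    model_f      -- callable returning the model curves FX and a status flag
    solver       -- callable implementing one least-squares algorithm
                    (simplex, Gauss-Newton, Levenberg-Marquardt, SQP, ...)
    """
    keep = np.asarray(keep, dtype=bool)
    x = np.asarray(x0, dtype=float).copy()

    def weighted_residuals(x_adj):
        x[keep] = x_adj                        # map adjustable subset into full vector
        fx, _ = model_f(n, m, mcon, x)         # model light and radial velocity curves
        return np.sqrt(fweigh) * (fobs - fx)   # weighted residual vector

    x_adj, cov, status = solver(weighted_residuals, x[keep],
                                tol=tol, max_iter=max_iter)
    x[keep] = x_adj
    errors = np.full_like(x, np.nan)
    errors[keep] = np.sqrt(np.diag(cov))       # statistical errors of fitted parameters
    return x, errors, status
```

Any of the algorithms listed above could be plugged in as the solver callable, provided it returns the fitted adjustable parameters, their covariance matrix, and a status flag.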

Each individual least-squares solver would access the light curve model through a subroutine F. This subroutine would require N, M, and MCON as passed to DETPAR, and in addition it could find the set of phase values in a common block.

The output would consist of the

• vector FX = {Lcal, RVcal, ...} containing all computed light curves and radial velocity curves; and

• status flag to interpret the results.

The subroutine F would again call subroutine LC, take the output of subroutine LC, i.e., all light curves and radial velocity curves, and store it into the vector FX. If several light curve models were implemented, subroutine F would find this information in a common block.
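Continuing the same hypothetical sketch, subroutine F might look as follows. The placeholders LC_MODEL and PHASES stand in for the light curve generator LC and the phase values that the Fortran version would obtain from a common block.

```python
import numpy as np

# In the Fortran design these would live in a common block; here they are
# module-level placeholders that the calling program is assumed to set.
LC_MODEL = None      # callable standing in for subroutine LC
PHASES = None        # phase values at which the curves are evaluated

def model_f(n, m, mcon, x):
    """Hypothetical counterpart of subroutine F: evaluate the selected light
    curve model at the stored phase values and return all computed light and
    radial velocity curves concatenated into one vector FX, plus a status flag."""
    curves = LC_MODEL(x, PHASES)   # e.g., [Lcal(band 1), ..., RVcal(1), RVcal(2)]
    fx = np.concatenate([np.asarray(c, dtype=float) for c in curves])
    status = 0 if fx.size == n and np.all(np.isfinite(fx)) else 1
    return fx, status
```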

9.2 Procedural Philosophies

Experto credite (Believe one who knows by experience)

How might a general light curve analysis program be used? It should help the user to process and analyze his data (we assume that photometric data have been reduced and transformed to a standard system as per the precepts of Chap. 2 and that any other data have been suitably standardized as well). It should be able to make best use of anything that is known about the system, such as its distance, amount of interstellar extinction, and reddening.

The first step is to import the data and to process them into standard format. The format would depend on the type of data. For photometric data, it would consist of triads of Julian Date or phase; differential or standard-system magnitudes, flux level, or energy; and a suitable weighting factor. The data would then be transformed into a standard form, such as triads of phase (if the period, epoch, and any variations of these elements are available), relative light, and weight.
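As a simple illustration of this step, and assuming a linear ephemeris and a reference magnitude m_ref (an assumption of this sketch, not part of the scheme above) that defines unit light, the transformation might be sketched as:

```python
import numpy as np

def to_standard_form(jd, mag, weight, t0, period, m_ref):
    """Convert (Julian Date, magnitude, weight) triads into
    (phase, relative light, weight) triads, assuming the linear
    ephemeris T(E) = t0 + period * E and a reference magnitude
    m_ref that is assigned unit light."""
    phase = np.mod((np.asarray(jd, dtype=float) - t0) / period, 1.0)  # phase in [0, 1)
    light = 10.0 ** (-0.4 * (np.asarray(mag, dtype=float) - m_ref))   # relative flux
    return phase, light, np.asarray(weight, dtype=float)
```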

The data would then be analyzed according to a selected light curve model. If none is specified, a series of models could be tried, beginning with a simple one that provides some rough initial parameter values. The preliminary parameter values could then be used as initial input into a more sophisticated analysis using a model based on Roche geometry. The program might request a selection of stellar atmospheres, or information supplied earlier could be used to select appropriate stellar atmospheres.

Depending on the characteristics of the binary system, the appropriate physics would be attached to the model and the data reanalyzed. The result could be

1. geometrical parameters;

2. stellar parameters;

3. physical parameters describing gas streams, disks, or other physical objects; and

4. a statistical analysis including error limits for all parameters.

If the general light curve analysis program were used in this way, the results could readily be compared to those of other authors. Such an analysis program could set a standard for describing and archiving the properties of eclipsing binary stars in general.

9.3 Code Maintenance and Modification

Remis velisque (With oars and sails; with all one's might)

The construction of any program requires a method by which changes can be made, implemented, and remembered. Because science is usually progressive, model improvements are inevitable. A prudent modeler will plan for them. Changes are easier, faster, and safer to implement if the program is modularized.

An important issue to resolve is: Who will make the changes, check the code for unexpected consequences or bugs, issue new version numbers, and distribute the code? Code modifications require a central authority to take responsibility for documenting and upgrading improvements, and, if our precepts regarding the modularization of the code are accepted, for incorporating additional modules and subroutines as they are developed. For data-acquisition and data-reduction packages, international observatories (KPNO + CTIO = AURA and ESO) take on this role. There is, however, no natural agency to take on a similar role for light curve analysis code development. Indeed, there should be no way to prevent someone from modifying a code to accomplish some particular task. But, if this is done, it is important that all such modifications be identified, especially in publications that make use of the result. While we find it unpalatable that a superauthority police the use of a code, there is a case for a code creator to seek copyright or other protection for its use to ensure that anyone modifying the code could be obliged to identify and document changes in any publication that emerges. The modifying party could be required to embed such documentation in comments before passing her/his modified code to anyone else. This protocol could reduce confusion about code legitimacy and performance.

Another possible mechanism to ensure responsible use and modification of programs would be to invoke the moral authority of the IAU through resolutions suggesting these steps at a General Assembly. Commissions 42 (Close Binary Stars), 26 (Double Stars), 27 (Variable Stars), and 25 (Photometry and Polarimetry), and perhaps others, would be likely sources of such resolutions. In any case, we recommend a naming scheme for identifying successive versions and upgrades. This would simplify the work of future investigators in associating solutions with program versions.

Technology has settled at least one class of questions. Questions such as how upgrades could be announced and distributed (so troubling in past decades) are now rendered trivial by the widespread communication media of the Internet and the World Wide Web.

The question of who will be responsible for maintaining the web pages and providing public access to various versions is still unresolved. In the case of a living creator, there is presumably no doubt; nor would there be any if an "heir" were designated by the progenitor. In the case that no such heir is designated, we recommend again that the relevant IAU commissions (mainly 42 - Close Binary Stars, but also 26 - Double Stars, and perhaps others) examine the question and designate an individual or group to take on the responsibility of code maintenance, upgrade, and distribution. One possibility is the creation of a multicommission Working Group on Data Analysis.

9.4 Prospects and Expectations

Tempora mutantur, et nos mutamur in illis

(The times change, and we along with them)

Modern light curve analysis was born early in the twentieth century with the pioneering work of Henry Norris Russell. Thanks first to the theoretical investigations of Zdenek Kopal beginning in the 1940s, and second to the development of practical computer applications by a number of workers two decades later, the field underwent a dramatic revolution in the 1970s. Since then light curve analysis has taken full advantage of the remarkable progress in both computer hardware and software. The Wilson-Devinney program has become the light curve analysis tool of choice by the majority of the community, and the line of improved modeling and models extends forward toward the horizon.

It seems likely that the number of observed light curves will continue to exceed the number analyzed. Indeed, the use of CCDs of increasingly large format threatens to overwhelm us with light curve data. Thousands of light curves are beginning to emerge from large field imaging projects, such as OGLE and MACHO, which were developed to find gravitational microlensing events. Unfortunately, the analysis of binaries from such sources suffers from the lack of radial velocities as well as spectral and color index data. Nevertheless, as image processing codes provide reliable magnitudes for more and more stars, the number of light curves will exhaust our ability to analyze them unless concomitant ways of providing fast and reliable analysis can be developed. Millions more are expected from the Kepler and GAIA missions, and from ground-based surveys such as the LSST. Section 5.3 contains promising techniques to approach this challenge. Perhaps neural networks could be used to identify the general type of light curve, and then appropriate light curve models used to attack the subtleties and provide solutions. Neural network techniques are most advantageous when the database of well-studied light curves is very large, since only a large database can provide an adequate training set.

The quality of light curve analyses can also be expected to improve. Accurate stellar atmospheres over expanded temperature and wavelength regimes will provide the means to model the fluxes of binary components with ever-increasing accuracy. Finally, an increasing range of types of objects and astrophysical conditions can be expected to be modeled successfully with a package of standardized programs. Extended atmospheres, semi-transparent atmospheric clouds, variable-thickness disks, and gas streams, as well as a plethora of planetary transits and (from infrared data) occultations, are among such phenomena.

What does the long-term future hold? Besides the determination of orbits, stellar sizes, and masses it seems likely that the detailed physics of stellar surfaces, including those arising from activity cycles, will continue to be targets of modeling work. It also seems likely that diagnostic tools will continue to be developed within light curve codes to provide more insight into stellar astrophysics. The development of ever more accurate stellar atmospheres is the key to the successful use of the analysis codes for the exploration of the radiative properties of the stars. It is crucial in transforming synthetic light curve codes into a diagnostic tool of great power. Such developments will lead to ever better tools for the elucidation and understanding of stars.

Finally, close binary research might initiate projects involving complicated physics and requiring sophisticated mathematics or huge number crunching. Here, we mention six problems:

1. The structural-dynamic readjustment of tidally distorted stars in eccentric orbits. To what extent does stellar volume vary in response to forced nonradial oscillations?

2. Computation of binary star interiors in terms of three-dimensional structural dynamics: How is matter redistributed in a binary component that almost fills its limiting lobe and is close to the beginning of mass loss through the Lagrangian point?

3. Three-dimensional radiation-hydrodynamic problems in binary systems. This includes meridional circulation and stellar winds and would treat the radiative transfer without geometric assumptions.

4. A number of sources of spectral emission could be modeled usefully. The presence of coronal plumes in systems with solar-like and later spectral type stars is one of these. Another is the stream in morphologically defined Algol systems. A third is the extensive disks around W Serpentis systems or around white dwarfs in CV systems. Such complex structures are not easily characterized by one or even two parameters, although software exists that treats these kinds of objects.

5. Investigation of dynamically evolving configurations that result from tides in eccentric orbits, as in high-mass X-ray binaries such as GP Velorum/Vela X-1, Centaurus X-3, and V884 Scorpii (HD 153919), an optical O6.5 supergiant of V magnitude 6.m5.

6. Adjustable EB parameters that change in time: Currently, the rate dP/dt of period change and the orbital rotation dω/dt (apsidal motion) are the only parameters which are traced in time. Third bodies inducing Kozai cycles as discussed in Sect. 5.2.1 can produce significant changes in a binary's eccentricity and inclination [cf. SS Lac studied, for instance, by Torres & Stefanik (2000), Milone et al. (2000), or Eggleton & Kiseleva-Eggleton (2001)]. Whereas the longer period in Kozai cycles, related to circularization and shrinkage of the orbit, is of the order of 100,000 years, the shorter cycles from large to small and back again to large eccentricities are of the order of 1,000 years. Thus, the effects are measurable over decades. Another group of adjustable parameters that would be useful for systems such as V781 Tau, analyzed by Kallrath et al. (2006), comprises curve-dependent spot parameters and allowance for differential stellar rotation and latitude migration of spot groups. The idea here is that the curves have been observed at times when different spots were present on the stars. This work has been done for single stars by Harmon & Crews (2000) and by the Catania astronomers.
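For reference, and assuming the usual conventions, the two quantities currently traced in time enter through the quadratic ephemeris for the times of minimum and the linear apsidal-motion law,

\[
T(E) = T_0 + P\,E + \tfrac{1}{2}\,\frac{dP}{dE}\,E^{2},
\qquad
\omega(t) = \omega_0 + \frac{d\omega}{dt}\,(t - t_0),
\]

where E is the cycle number and dP/dE = P (dP/dt).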

These examples reemphasize that close binaries are not only rich in physics but also that the ongoing need to extract the full measure of information contained in the data leads, in turn, to progress in the mathematical and numerical methods used in astrophysics. Vive l'astrophysique!

References

Eggleton, P. P. & Kiseleva-Eggleton, L.: 2001, Orbital Evolution in Binary and Triple Stars, with an Application to SS Lacertae, ApJ 562, 1012-1030

Harmon, R. O. & Crews, L. J.: 2000, Imaging Stellar Surfaces via Matrix Light-Curve Inversion, AJ 120, 3274-3294

Kallrath, J., Milone, E. F., Breinhorst, R. A., Wilson, R. E., Schnell, A., & Purgathofer, A.: 2006, V781 Tauri: A W Ursae Majoris Binary with Decreasing Period, Astronomy and Astrophysics 452, 959-967

Milone, E. F., Schiller, S. J., Munari, U., & Kallrath, J.: 2000, Analysis of the Currently Noneclipsing Binary SS Lacertae or SS Lacertae's Eclipses, AJ 119, 1405-1423

Torres, G. & Stefanik, R. P.: 2000, The Cessation of Eclipses in SS Lacertae: The Mystery Solved, AJ 119, 1914-1929
