PEMROSESAN "FEATURE DETECTION / DETEKSI FITUR / IMAGE SCANNING PROCESS" PADA ARTIFICIAL INTELLIGENCE (AI) DALAM OTAK MANUSIA PDF

Title Processing of "Feature Detection / Image Scanning" in Artificial Intelligence (AI) and in the Human Brain
Author D. Sandika

Volume 71B, number 2

PHYSICS LETTERS

21 November 1977

THE ESSENTIAL DECAY OF PANDEMONIUM:
A DEMONSTRATION OF ERRORS IN COMPLEX BETA-DECAY SCHEMES

J.C. HARDY *, L.C. CARRAZ, B. JONSON † and P.G. HANSEN ‡
CERN, Geneva, Switzerland

Received 14 September 1977

The β-decay of a fictional nuclide, Pandemonium, is created numerically using a statistical model. By analyzing its simulated γ-ray spectrum as experimental data, we find that much γ-ray intensity actually remains unobserved under normal experimental conditions. This result - illustrated for the case of 145Gd - casts doubt on many decay schemes determined from such spectra.

If many β-transitions contribute to the decay of a nucleus, their properties cannot be studied in detail through direct observation of the continuous spectrum of emitted electrons. Instead, most experiments concentrate on the subsequently emitted γ-radiation, whose spectrum is characterized by discrete energy groups. If high-resolution detectors are used, it is usually supposed that any β-transition not leaving its mark in the form of a γ-ray peak must make little contribution to the decay scheme. This is undoubtedly true if the β-decay energy is low and the number of transitions consequently few. But can we expect an intuition valid for these conditions to be equally reliable when the decay energy is high and many final states are available? The states being populated must then be comparable in their close spacing to those encountered in neutron resonance studies, for which elaborate statistical procedures have been developed [1]. Can these really be ignored in γ-ray spectroscopy?

Such considerations have not prevented γ-ray spectroscopy from being used to unravel even the most complex β-decay schemes, as a look at almost any copy of the Nuclear Data Sheets will confirm. Yet, little attention has been paid to the possibility of errors inherent to the method itself - quite apart, that is, from the more obvious possibilities of errors

* On leave from AECL, Chalk River, Canada.
† On leave from Department of Physics, Chalmers University of Technology, Göteborg, Sweden.
‡ On leave from Institute of Physics, University of Aarhus, Aarhus, Denmark.

due to misassigned γ-rays, undetected conversion and the like. Presumably, in the absence of alternative methods for accurately measuring β-transition rates, little opportunity exists for realizing limitations in the existing approach. If one knew some transition rates a priori, then the success of the experimental method could be tested in attempting to reproduce those rates. Although nature is parsimonious with such a priori knowledge, we need not find ourselves so restricted. Our expedient is simply to create a fictional nucleus and then simulate its decay numerically, imposing upon the process only those broad statistical principles known to circumscribe nuclear excitation and decay [1,2]. We have named this "unstructured" fictional nucleus Pandemonium *1. Since its existence is confined to the functioning of a computer program, we can on the one hand know the Pandemonium β-transition rates exactly, while on the other hand generate for analysis a simulated spectrum of β-delayed γ-rays.

For the present illustrative calculation we have chosen to assign to Pandemonium A = 145 and Z = 64, the mass and atomic number of 145Gd. This choice was made partly because the decay of such a medium-mass nucleus is representative of many complex β-decay schemes and partly because the 145Gd decay has already attracted general interest through its use

"The rest were all Far to the inland retired, about the wails Of Pandemonium city and proud seat Of Lucifer." J. Milton in Paradise Lost X (1667) line 424 307

Fig. 1. Computer-simulated γ-ray spectrum corresponding to the fictional decay of Pandemonium, which in this application has been assigned the mass and atomic number of 145Gd.

in determining apparently anomalous EC/β+ ratios [3,4].

The decay of Pandemonium was computed using Monte Carlo techniques. First, states of appropriate spin were generated in the daughter nucleus; their density was taken from the formulas of Gilbert and Cameron [5], with level-to-level distances scattered randomly according to the Wigner law [2]. Then, Gamow-Teller transition probabilities to these states were generated with a random Porter-Thomas distribution [2], the energy dependence of the average value being set by assuming a constant β-decay strength function *2. The spin of Pandemonium was taken to be 1/2+ (as it is for 145Gd) and its total decay energy QEC = 5.0 MeV [6]. The 3/2+ states populated in the daughter were assumed to γ-decay directly to the ground state while the 1/2+ states decay to a state at 800 keV; though somewhat simplified, this approximates the actual situation observed [3] for 145Eu. Within this framework, a complete β-decay scheme with attendant γ-ray emission could be created from random numbers generated by the computer.

*2 Real nuclei will, of course, show more or less pronounced effects from detailed nuclear structure. Apart from transitions to states at low excitation energy, which we shall not be considering anyway, these effects should appear as broad topographical features in the strength function [2]. It will become evident that our conclusions are independent of such relatively slow changes.

The γ-ray spectrum shown in fig. 1 was then simulated by using a 3 keV (FWHM) Gaussian resolution function and double- (but not single-) escape peaks, the Compton distribution being introduced numerically from the measured [7] distribution for a monoenergetic γ-ray. The energy-dependent efficiency of a real Ge(Li) detector was also incorporated, with the photo-peak efficiency assumed - for simplicity in analysis - to be strictly proportional to Eγ^-1. The number of counts per channel (= 1 keV) was chosen to approximate typical experimental conditions, and the effect of Poisson counting statistics was superimposed.

Treating the spectrum of fig. 1 as experimental data, which it closely resembles, we processed it in a standard manner using the peak analysis program SAMPO [8]. Because of the purely statistical nature of the Pandemonium decay, we did not consider it relevant to deal with low-energy γ-rays. For those above 1.7 MeV the results (data set I) appear in table 1, together with the results of three other "experiments". First, the same decay scheme was "observed" in a spectrum with one tenth the number of counts (set I : 10), and then in one with ten times the number (set I × 10); in both cases the counting statistics were altered appropriately. Finally, another complete decay scheme was generated (set II), differing from the first only in the random numbers involved in the calculation, and the corresponding γ-ray spectrum was analyzed.
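A minimal Python sketch of such a calculation is given below. It follows the recipe just described - Wigner-distributed level spacings, Porter-Thomas Gamow-Teller strengths, a photopeak efficiency proportional to 1/E, Gaussian resolution and Poisson noise - but the remaining choices (a flat toy level density in place of the Gilbert-Cameron formula, a (QEC - E)^5 phase-space stand-in for the Fermi integral, one γ-ray per level, no escape peaks or Compton continuum, the statistics used) are assumptions of the sketch, not the authors' program.

    import numpy as np

    # Illustrative Pandemonium-style Monte Carlo (not the authors' code).
    rng = np.random.default_rng(0)

    Q_EC = 5000.0          # total decay energy in keV (value used in the paper)
    D = 5.0                # assumed mean level spacing in keV (toy level density)
    FWHM = 3.0             # detector resolution in keV, as in the paper
    N_DECAYS = 1_000_000   # assumed counting statistics

    # Daughter levels: nearest-neighbour spacings drawn from the Wigner surmise.
    u = rng.random(2000)
    spacings = D * np.sqrt(-4.0 / np.pi * np.log(1.0 - u))
    levels = np.cumsum(spacings)
    levels = levels[levels < Q_EC]

    # Beta feeding: Porter-Thomas strengths times an allowed phase-space factor
    # (a constant strength function means the average feeding follows phase space).
    strengths = rng.chisquare(df=1, size=levels.size)
    feeding = strengths * (Q_EC - levels) ** 5
    feeding /= feeding.sum()            # exact branching ratios, known a priori

    # Gamma-ray spectrum: one gamma per decay straight to the ground state,
    # relative photopeak efficiency ~ 1/E (normalised at 1 MeV), Gaussian
    # resolution, Poisson counting noise, 1 keV channels.
    counts = rng.multinomial(N_DECAYS, feeding)
    weights = counts * (1000.0 / levels)
    sigma = FWHM / 2.355
    channels = np.arange(0.0, Q_EC, 1.0)
    spectrum = np.zeros_like(channels)
    for energy, w in zip(levels, weights):
        spectrum += w * np.exp(-0.5 * ((channels - energy) / sigma) ** 2)
    spectrum = rng.poisson(spectrum)

    print(f"{levels.size} gamma rays in the true decay scheme; "
          f"strongest branch {feeding.max():.2%}")

Because the branching ratios are constructed inside the program, the "truth" is known exactly and any subsequent peak analysis can be graded against it.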

Table 1
Results obtained from analyzing the simulated γ-ray spectrum (for Eγ ≥ 1.7 MeV) from the fictional decay of Pandemonium.

Data set          No. of photopeaks detected    Intensity observed
I                 35                            86%
I : 10            8                             55%
I × 10            81                            94%
II a)             70                            95%
experiment [3]    30                            -

a) The total number of counts in data set II corresponded most nearly with that of set I × 10. Both have ~10^6 counts in each of a number of peaks above 1.7 MeV and thus represent more complete data than most real experiments can produce.

As might be expected, the number of γ-rays observed depends upon the counting statistics. However, none of the "experiments" observe even as many as 10% of the ~1000 γ-ray transitions actually present in the decay. Still more significant is the total intensity of the unobserved transitions: in the spectrum of fig. 1 (data set I), for example, 14% of the true γ-ray intensity above 1.7 MeV is completely undetected. In fig. 2a the γ-ray intensity actually detected as peaks in set I is compared with the true intensity as a function of γ-ray energy. The corresponding "observation efficiency" for the detection of γ-ray intensity appears in fig. 2b, as does the equivalent information for the different counting statistics of sets I : 10 and I × 10. The evident decrease with higher γ-ray energy can be attributed to the increasing level density, which disperses the Pandemonium β-transition strength and results in a large number of relatively weak, often unresolved, γ-rays. The few peaks remaining above this continuum result from the skewness of the statistical distribution of transition probabilities, a familiar feature of Porter-Thomas statistics.

For the decay of 145Gd, the present results have considerable significance. The accepted decay scheme, upon which EC/β+ ratios were based [4], was in its turn based upon a spectrum of β-delayed γ-rays [3]. The number of counts recorded in that spectrum places it, for comparison, between our data sets I and I : 10.
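The "observation efficiency" of fig. 2b is simply the fraction of the true γ-ray intensity in each 200 keV interval that is recovered as photopeaks. A short Python sketch of that bookkeeping follows; the function and its example numbers are invented for illustration and are not output of the SAMPO analysis.

    import numpy as np

    def observation_efficiency(true_E, true_I, found_E, found_I,
                               e_min=1700.0, e_max=5000.0, width=200.0):
        """Detected fraction of the true gamma intensity per energy bin (keV)."""
        edges = np.arange(e_min, e_max + width, width)
        true_hist, _ = np.histogram(true_E, bins=edges, weights=true_I)
        found_hist, _ = np.histogram(found_E, bins=edges, weights=found_I)
        with np.errstate(invalid="ignore", divide="ignore"):
            eff = np.where(true_hist > 0, found_hist / true_hist, np.nan)
        return edges[:-1], eff

    # Made-up example: the peaks nearest the Q-value are the ones that vanish.
    true_E = np.array([1800.0, 2500.0, 3300.0, 4100.0, 4800.0])
    true_I = np.array([0.30, 0.25, 0.20, 0.15, 0.10])
    found_E = np.array([1800.0, 2500.0, 3300.0])
    found_I = np.array([0.29, 0.21, 0.12])
    bins, eff = observation_efficiency(true_E, true_I, found_E, found_I)
    print(f"total intensity recovered: {found_I.sum() / true_I.sum():.0%}")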


Fig. 2. a) Observed and true γ-ray intensity (shaded and unshaded histograms, respectively) for data set I plotted in intervals of 200 keV of γ-ray energy. The ordinate is scaled to give a total of 100% true intensity above 1.7 MeV. b) The circles correspond to the data in fig. 2(a) expressed as an "observation efficiency" for the detection of γ-ray intensity. The solid line is drawn simply to indicate the trend of the results. The two dashed curves show the corresponding trends for data set I : 10 (lower efficiencies) and I × 10. Sets II and I × 10 were indistinguishable. Note that Eγmax = QEC = 5.0 MeV.

I " 10. Its similarity with the Pandemonium spectrum can be verified visually and by noting that the experimenters' own analysis yielded [3] 30 photo-peaks above 1.7 MeV (cf. table 1). By analogy with Pandemonium, then, one must conclude that ~20% of the T-ray intensity above 1.7 MeV from 145Gd was not even recognized, let alone placed in the decay scheme. Already large, this proportion must still be an underestimate. In our Pandemonium calculation we have assumed that each excited state populated in the daughter subsequently decays by emitting a single 309

Actual decay schemes are rarely so simple, and complexity only serves to disperse the total intensity further, thus making even less of it visible in the form of γ-ray peaks.

The essential aspects of these conclusions are confirmed by a recent remeasurement [6] of the 145Gd γ-spectrum with about 10 times the number of counts observed in ref. [3]. A peak analysis yielded ~20% more γ-ray intensity above 1.7 MeV. According to our calculations there must be still more intensity undetected, but even without it the evidence for EC/β+ anomalies is removed [6].

Obviously our results have wider implications than simply to the decay of 145Gd. Every complex β-decay scheme that is based on γ-ray peak analysis and intensity balances must now be regarded as doubtful. In such schemes, the β-decay feeding to each level is assumed to be the difference between the total γ-ray intensity depopulating the level and that seen feeding it. If significant γ-ray intensity remains unobserved, these differences are incomplete and the derived β-decay branching ratios, for all but the strongest transitions, could be wrong by orders of magnitude. In discrediting the "measured" ft values for most β-transitions in complex decay schemes, this conclusion reflects on a large body of existing data, and surely indicates the need to reevaluate the usefulness of a whole class of experiments.
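A toy numerical example, with numbers invented here rather than taken from any real decay scheme, shows how unobserved γ-ray intensity is misread as β feeding in such an intensity balance:

    # Toy intensity balance (numbers invented).  The beta feeding of a level is
    # inferred as gamma intensity out minus gamma intensity observed coming in,
    # so any unobserved feeding intensity is misread as beta feeding.
    def beta_feeding(gamma_out, gamma_in_observed):
        return gamma_out - gamma_in_observed

    # A level emits 10 units of gamma intensity and is truly fed from above by
    # 9 units, so its true beta feeding is 1 unit.
    true_feeding = beta_feeding(10.0, 9.0)        # 1.0

    # If a third of the feeding gamma intensity is lost in the unresolved
    # continuum, the balance assigns it to beta decay instead.
    apparent_feeding = beta_feeding(10.0, 6.0)    # 4.0 - wrong by a factor of 4
    print(true_feeding, apparent_feeding)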

References

[1] J.E. Lynn, The theory of neutron resonance reactions (Clarendon Press, Oxford, 1968).
[2] B. Jonson et al., Proc. 3rd Int. Conf. on Nuclei far from stability, CERN Report 76-13 (1976) 277.
[3] R.E. Eppley, W.C. McHarris and W.H. Kelly, Phys. Rev. C3 (1971) 282.
[4] R.B. Firestone, R.A. Warner, W.C. McHarris and W.H. Kelly, Phys. Rev. Lett. 33 (1974) 30 and 35 (1975) 401.
[5] A. Gilbert and A.G.W. Cameron, Can. J. Phys. 43 (1965) 1446; J.W. Truran, A.G.W. Cameron and E. Hilf, CERN Report 70-30 (1970) 275.
[6] P. Hornshøj, H.L. Nielsen and N. Rud, Phys. Rev. Lett. (to be published) and private communication.
[7] R.L. Heath, Gamma ray spectrum catalogue (Aerojet Nuclear Co., 1974).
[8] J.T. Routti and S.G. Prussin, Nucl. Instr. 72 (1969) 125.

Perception, 1978, volume 7, pages 97-104

Pandemonium and visual search

Leslie Henderson
Psychology Group, Hatfield Polytechnic, PO Box 109, Hatfield AL10 9AB, England
Received 14 February 1977

Abstract. Pandemonium-like models have played a central role in theories of perceptual recognition. One model is examined which asserts that information is sorted unidirectionally through a hierarchy of increasingly abstract levels, only to a depth required by the logical demands of the task, and read off from the appropriate level to control response decisions. The support originally claimed for the model in terms of its application to visual search performance is questioned. It is suggested that the pervasiveness of such models is not due to their competition with alternative theories but rather to metatheoretic considerations.

"O where have you been, my long, long love,
This long seven years and mair?"
The Demon-lover, A Border Ballad

A remarkable aspect of psychological theorising about pattern recognition throughout the sixties was the dominance of a Pandemonium-like model as an overview, almost a metatheory, of the organisation of information processing in perceptual recognition. This widely held but often implicit commitment to a particular hierarchical model stood somehow apart from the ongoing disputes about experimental outcomes and their theoretically local interpretation. It is the purpose of this note to reexamine the classic invocation of Pandemonium as an account of visual search (Neisser 1963, 1964, 1967). As a prelude I shall attempt briefly to explicate the significant features of the theory as it was applied to search performance. I shall end with a few rather general speculations about the reasons why the hierarchical model has so dominated the thinking of experimental psychologists.

Four principles characterise the model. These are its hierarchical organisation, its bottom-up access, the notion of penetration only to logically sufficient depth, and the availability of multiple read-off. Students of demonology will be familiar with the model's hierarchical organisation into a set of levels representing increasingly complex conjunctions of the sensory elements or, alternatively expressed, representing increasingly abstract and constrained categories. Such a structure in itself merely embodies an aspect of human knowledge while asserting little in particular about how we process information. The bottom-up principle adds, therefore, a statement about information flow, asserting that first access to the system is restricted to the elementary bottom layer of the pyramid and that upward progress through the levels is unidirectional and contingent, that is, progress depends on the outcomes at previous levels. Next we come to the principle of economy, which asserts that information is only processed through the system to a level which is logically sufficient for a decision to be made. Finally, a principle which links the previous principle to task performance asserts that processing outcomes at any level can be read off in order to control performance decisions appropriate to that level.

The Pandemonium concept originated in Artificial Intelligence work intended to specify a competent recognition system. The first two principles, which deal with the hierarchic architecture of the system and the bottom-up flow of information, are essential to its mode of operation.
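The four principles can be caricatured in a few lines of Python. The sketch below is only an illustration: the particular feature and letter "demons", the stopping rule, and the toy stimulus are inventions of the sketch, not Selfridge's original program nor Neisser's model.

    # Caricature of the four principles: hierarchy, bottom-up flow,
    # economy of depth, and read-off at whichever level suffices.
    def feature_demons(stimulus):
        # bottom layer: elementary feature detectors
        return {"has_curve": stimulus["shape"] in ("O", "Q", "C"),
                "has_line": stimulus["shape"] in ("L", "T", "Q")}

    def letter_demons(features):
        # higher layer: conjunctions of features
        return {"is_O": features["has_curve"] and not features["has_line"]}

    def search(stimulus, levels, sufficient):
        """Pass evidence upward level by level (bottom-up, contingent flow) and
        stop at the first level whose output suffices for the task (economy of
        depth); the decision is read off from whichever level that is."""
        evidence = stimulus
        for depth, level in enumerate(levels):
            evidence = level(evidence)
            if sufficient(evidence):
                return depth, evidence
        return len(levels) - 1, evidence

    hierarchy = [feature_demons, letter_demons]
    # A task over elementary features is settled at the bottom layer ...
    depth_curve, _ = search({"shape": "O"}, hierarchy,
                            lambda e: e.get("has_curve", False))
    # ... whereas identifying the letter O needs one level more.
    depth_letter, _ = search({"shape": "O"}, hierarchy,
                             lambda e: e.get("is_O", False))
    print(depth_curve, depth_letter)   # 0 1

In this caricature a task defined over elementary features is settled at the bottom layer, whereas identifying a letter forces the evidence one level further up; that difference in required depth is what the economy principle trades on.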

However, the second pair of principles, which deal with economy of depth and multiple read-off, derive from the fitting of the model to human search performance. They are psychological rather than logical in provenance.

Let us turn, then, to Neisser's crucial analysis of visual search. [I shall generally follow the argument of Neisser (1964), but this choice of source is not significant.] Neisser begins his exposition by bringing out the commonality between search, figure-ground segregation, and attention which later formed the central theme of his book (Neisser 1967). To illustrate how differing depths of processing may be required in such tasks, he contrasts the problem of detecting a parachutist against a bright sky with the problem of detecting the member of a set of numbers which is a multiple of seven. Neisser's choice of contrast is, perhaps, revealing because this particular contrast correlates two distinctions. One hinges on the sufficiency of "elementary visual mechanisms for distinguishing contrast and contour... to locate the parachutist". The other involves the fact that the parachutist as target is directly perceived, in Julesz' (1975) sense, whereas the multiple-of-seven as target has to be computed. However, it is those typical but unmentioned cases where a directly perceived quantity such as figure-ground cannot obviously be arrived at solely by elementary feature analysis which seem to pose a problem for exclusively bottom-up models [for example the dalmatian dog on page 8 of Lindsay and Norman (1972)].

Neisser goes on to give his much-quoted reasons for the economy principle. The first is that subjects see background items as "only a blur". This he interprets as showing that information about background items "penetrates the nervous system only far enough so that some subsystem (sufficient to detect non-target features) can have an opportunity to react". A related finding was that subjects were unable to recall background items and unable to detect a change of the fixed set of background items even after several days' practice with the set. The analogy between these claims and those concerning the fate of unattended items in, for example, dichotic listening is evident (see Neisser 1976, pp 93-95), and the difficulty of establishing that identification does not take place in search seems as great as it is in the listening task. In fact subsequent studies of visual search (Gordon 1968; Gleitman and Jonides 1976) have tended to find partial recognition of background characters. The assumption that incidental learning directly reflects the depth of encoding is not only shared with traditional filter interpretations of dichotic listening but is also central to a recent and influential theory of memory (Craik and Lockhart 1972). It is interesting to note, therefore, that in their revision of the memory theory Craik and Tulving (1975) go some way toward abandoning the depth-of-processing notion in favour of a concept of the amount of elaboration of encoding. Furthermore, Coltheart (1977) has shown that simply measuring the amount of incidental learning as a function of different orienting tasks in the Craik and Tulving manner gives no reliable indication of the nature or depth of code used; she found both acoustic and semantic confusions with either type of orienting task. Neisser refers to even more compelling reasons for believing that processing of background items stops shor...

