Hallelujah! is a British television sitcom produced by Yorkshire Television that aired on ITV from 29 April 1983 to 21 December 1984.
The series was set in a Salvation Army citadel in the fictional Yorkshire town of Brigthorpe during series 1 (and later in the fictional place of Blackwick in series 2). Captain Emily Ridley (Thora Hird) has been posted there, having been an active member of the Salvation Army for 42 years. Despite the town and residents being seemingly pleasant, Emily is determined to flush out sin from behind the net curtains. Assisting Emily is her niece Alice Meredith (Patsy Rowlands).
The programme was a further collaboration between Hird and its creator Dick Sharples, who also worked together on the comedy series In Loving Memory (1979–1986).
The series also featured guest appearances from the likes of Hird's Last of the Summer Wine co-star Michael Aldridge and television presenter Richard Whiteley.
Plot
The show was set in a Salvation Army citadel in the fictional Yorkshire town of Brigthorpe during Series 1, and in the equally fictional Blackwick during Series 2. Dame Thora Hird starred as Captain Emily Ridley, with Patsy Rowlands as her niece Alice Meredith and Rosamund Greenwood as Sister Dorothy Smith (who left after the first series and was replaced by David Daker as Brother Benjamin in the second).
A notable characteristic of the show was that every episode ended with the studio audience clapping once during the closing sequence.
Cast
Thora Hird - Captain Emily Ridley
Walter Gotell - Lt. Colonel Henderson
Patsy Rowlands - Alice Meredith
Rosamund Greenwood - Sister Dorothy Smith (series 1)
David Daker - Brother Benjamin (series 2)
Geoffrey Bayldon - Mr Sedgewick
Michael Aldridge - Brig Langton (series 1)
Garfield Morgan - Brig Langton (series 2)
Episodes
Series 1 (1983)
Retirement (29 Apr 83)
Repentance (6 May 83)
Counselling (13 May 83)
Poor Box (20 May 83)
Luncheon Club (27 May 83)
Mobile Canteen (3 Jun 83)
Struck Down (10 Jun 83)
Series 2 (1984)
Marching Orders (2 Nov 84)
Just A Song At Twilight (9 Nov 84)
Holy Deadlock (16 Nov 84)
The Snake Pit : Part 1 (23 Nov 84)
The Snake Pit : Part 2 (30 Nov 84)
Rock Bottom (7 Dec 84)
It Happened One Night (14 Dec 84)
A Goose For Mrs. Scratchitt (Christmas Special) (21 Dec 84)
Only 15 episodes were made across the two series, though some sources claim there were three series. This is incorrect; only two series and one Christmas special were broadcast, between 1983 and 1984.
In 1985, it was reported that a continuation of the series was to be produced for BBC 2. This follow-on series never appeared; instead, later that year Hird signed up to play Edie Pegden in Last of the Summer Wine, a character with a number of parallels to her Hallelujah! character Captain Emily Ridley, and a role written especially for her by Roy Clarke when she became available.
Location filming
The series was filmed mostly in and around the Yorkshire Television studios and the Leeds area. The most notable filming location was Leeds General Infirmary, and a number of run-down buildings around Leeds at the time also appear. Some scenes in the Christmas special were filmed in the Victorian street exhibition at York Castle Museum.
The opening sequence was filmed outside the Garden Gate pub in Hunslet.
Theme music
The theme music was performed by the James Shepherd Versatile Brass, conducted by Robert Hartley.
DVD release
DD Home Entertainment (now known as Simply Home Entertainment) released series 1 and 2 in 2008. It initially claimed that the series 2 release was complete; however, as the Christmas special was not included, it later dropped this claim (the cover artwork, however, stayed the same).
Both series of Hallelujah! and the Christmas Special are now available from Network DVD.
Hallelujah! has been released in Australia by Acorn Media Australia. It is a boxset with both series plus the Christmas special. It has been released as an all region DVD (PAL).
Notes
1. Lewisohn, Mark. Radio Times Guide to British Comedy p. 292.
External links
1983 British television series debuts
1984 British television series endings
ITV sitcoms
1980s British sitcoms
Television series by Yorkshire Television
Television shows set in Yorkshire
English-language television shows
Nevado Ishinca, meaning "snow-covered mountain", is a peak in the Cordillera Blanca range of the Peruvian Andes. It stands in the Ishinca Valley and has a summit elevation of 5,530 metres. Ishinca is most often climbed via its normal route, the North-West Route, rated Alpine PD-. This route was first ascended by J. Fonrouge, W. Lindaver, H. Salger, H. Schmidbauer and V. Staudacher in 1964.
Oronymy
In Ancash Quechua (Anqash Runa Simi): ichinqa → ishinqa → ishinca ("it will stand").
References
Mountains of Peru
Mountains of Ancash Region
```c++
//
// path_to_url
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
#include <xxhash.h>
#include <algorithm>
#include <cmath>
#include "paddle/phi/backends/cpu/cpu_context.h"
#include "paddle/phi/core/dense_tensor.h"
#include "paddle/phi/core/kernel_registry.h"
#include "paddle/phi/kernels/funcs/math/bloomfilter.h"
#include "paddle/phi/kernels/funcs/search_compute.h"
namespace phi {
#ifndef _WIN32
bool should_use_term(phi::math::bloomfilter* _filter,
phi::math::bloomfilter* _black_filter,
const float* word_repr,
int len) {
return (!_filter || 1 == phi::math::bloomfilter_get(
_filter, word_repr, len * sizeof(float))) &&
(!_black_filter ||
0 == phi::math::bloomfilter_get(
_black_filter, word_repr, len * sizeof(float)));
}
template <typename T>
void hash_embedding_ff(const float* hash_id,
int len,
T* top_pos,
const T* weights,
int _num_emb,
int _rand_len,
int _space_len) {
unsigned int pos1 = XXH32(hash_id, len * sizeof(float), 0) % _space_len;
unsigned int pos2 =
XXH32(hash_id, len * sizeof(float), _rand_len) % _space_len;
for (int j = 0; j != _num_emb; j += _rand_len) {
if (j + _rand_len < _num_emb) {
__builtin_prefetch(weights + pos2);
__builtin_prefetch(top_pos + j + _rand_len);
}
unsigned int pos3 =
XXH32(hash_id, len * sizeof(float), j + 2 * _rand_len) % _space_len;
memcpy(top_pos + j, const_cast<T*>(weights + pos1), _rand_len * sizeof(T));
pos1 = pos2;
pos2 = pos3;
}
}
template <typename T, typename Context>
void CPUPyramidHashOPKernel(const Context& dev_ctx,
const DenseTensor& x,
const DenseTensor& w,
const DenseTensor& white_list,
const DenseTensor& black_list,
int num_emb,
int space_len,
int pyramid_layer,
int rand_len,
float drop_out_percent,
int is_training,
bool use_filter,
int white_list_len,
int black_list_len,
int seed,
float lr,
const std::string& distribute_update_vars,
DenseTensor* out,
DenseTensor* drop_pos,
DenseTensor* x_temp_out) {
auto* bottom = &x;
auto* _blobs_0 = &w;
auto* _blobs_1 = &white_list;
auto* _blobs_2 = &black_list;
auto* top = out;
int _num_emb = num_emb;
int _pyramid_layer = pyramid_layer;
int _is_training = is_training;
unsigned int _seed = (unsigned int)seed;
int _rand_len = rand_len;
int _space_len = space_len;
float _drop_out_percent = drop_out_percent;
const auto& offset = bottom->lod()[0];
const auto* bottom_data_ori = bottom->data<int32_t>();
auto* buff = x_temp_out;
buff->Resize(common::make_ddim({bottom->dims()[0], bottom->dims()[1]}));
float* bottom_data = dev_ctx.template Alloc<float>(buff);
for (int i = 0; i < bottom->dims()[0]; i++) {
bottom_data[i] = bottom_data_ori[i]; // NOLINT
}
const auto* weights = _blobs_0->data<T>();
std::vector<size_t> top_offset;
top_offset.resize(offset.size());
top_offset[0] = 0;
phi::math::bloomfilter* _filter = nullptr;
phi::math::bloomfilter* _black_filter = nullptr;
if (use_filter) {
if (white_list_len != 0) {
_filter = (phi::math::bloomfilter*)_blobs_1->data<float>();
PADDLE_ENFORCE_EQ(
phi::math::bloomfilter_check(_filter),
1,
common::errors::PreconditionNotMet(
"The white filter is not loaded successfully, please make sure "
"'white_list_len': %d is valid for Input(WhiteList).",
white_list_len));
}
if (black_list_len != 0) {
_black_filter = (phi::math::bloomfilter*)_blobs_2->data<float>();
PADDLE_ENFORCE_EQ(
phi::math::bloomfilter_check(_black_filter),
1,
common::errors::PreconditionNotMet(
"The black filter is not loaded successfully, please make sure "
"'black_list_len': %d is valid for Input(BlackList).",
black_list_len));
}
}
drop_pos->Resize(common::make_ddim(
{bottom->dims()[0] * bottom->dims()[1] * _pyramid_layer, 1}));
std::vector<size_t> drop_pos_offset;
drop_pos_offset.resize(offset.size());
drop_pos_offset[0] = 0;
int* iter = dev_ctx.template Alloc<int>(drop_pos);
int* iter_end = iter;
for (size_t i = 0; i < top_offset.size() - 1; ++i) {
int w = static_cast<int>(offset[i + 1] - offset[i]);
int nsentense_with_pyramid = 0;
if (w < 2) {
nsentense_with_pyramid = 0;
} else {
for (int ilayer = 1; ilayer < _pyramid_layer && ilayer < w; ++ilayer) {
for (int l = 0; l < w - ilayer; ++l) {
if (should_use_term(_filter,
_black_filter,
(const float*)(bottom_data + offset[i] + l),
ilayer + 1)) {
if (_is_training != 0) {
unsigned int rand_val = rand_r(&_seed);
double rate = static_cast<double>(rand_val) / (RAND_MAX);
*(iter_end++) = (rate < _drop_out_percent ? 0 : 1);
} else {
*(iter_end++) = 1;
}
} else {
*(iter_end++) = 0;
}
}
}
nsentense_with_pyramid = static_cast<int>(std::count(iter, iter_end, 1));
iter = iter_end;
}
drop_pos_offset[i + 1] = drop_pos_offset[i] + nsentense_with_pyramid;
top_offset[i + 1] =
top_offset[i] +
(nsentense_with_pyramid == 0 ? 1 : nsentense_with_pyramid);
}
int top_l = static_cast<int>(top_offset[top_offset.size() - 1]);
phi::LoD top_lod;
top_lod.push_back(top_offset);
top->set_lod(top_lod);
top->Resize(common::make_ddim({top_l, _num_emb}));
auto* top_data = dev_ctx.template Alloc<T>(top);
phi::LoD drop_pos_lod;
drop_pos_lod.push_back(drop_pos_offset);
drop_pos->set_lod(drop_pos_lod);
iter = dev_ctx.template Alloc<int>(drop_pos);
int top_counter = 0;
for (size_t i = 0; i < offset.size() - 1; ++i) {
int w_drop = static_cast<int>(drop_pos_offset[i + 1] - drop_pos_offset[i]);
int w = static_cast<int>(offset[i + 1] - offset[i]);
if (w_drop == 0) {
if (w >= 2) {
for (int ilayer = 1; ilayer < _pyramid_layer && ilayer < w; ++ilayer) {
for (int l = 0; l < w - ilayer; ++l) {
iter++;
}
}
}
auto* top_pos = top_data + top_counter++ * _num_emb;
memset(top_pos, 0, _num_emb * sizeof(T));
continue;
}
if (w >= 2) {
for (int ilayer = 1; ilayer < _pyramid_layer && ilayer < w; ++ilayer) {
for (int l = 0; l < w - ilayer; ++l) {
if (*(iter++) == 0) {
// do nothing
} else {
auto* top_pos = top_data + top_counter++ * _num_emb;
hash_embedding_ff<T>((const float*)(bottom_data + offset[i] + l),
ilayer + 1,
top_pos,
weights,
_num_emb,
_rand_len,
_space_len);
}
}
}
}
}
PADDLE_ENFORCE_EQ(
iter,
iter_end,
common::errors::PreconditionNotMet(
"The drop_pos buffer was not fully consumed while filling the "
"output; the filling pass is inconsistent with the counting pass."));
auto weight_type = phi::TransToProtoVarType(_blobs_0->dtype());
if (_is_training == 0 && weight_type != phi::ProtoDataType::INT8) {
phi::funcs::axpy_noadd(
top_data, top_data, top->dims()[0] * top->dims()[1], _drop_out_percent);
}
}
#endif
#ifdef _WIN32
template <typename T, typename Context>
void CPUPyramidHashOPKernel(const Context& dev_ctx,
const DenseTensor& x,
const DenseTensor& w,
const DenseTensor& white_list,
const DenseTensor& black_list,
int num_emb,
int space_len,
int pyramid_layer,
int rand_len,
float drop_out_percent,
int is_training,
bool use_filter,
int white_list_len,
int black_list_len,
int seed,
float lr,
const std::string& distribute_update_vars,
DenseTensor* out,
DenseTensor* drop_pos,
DenseTensor* x_temp_out) {}
#endif
} // namespace phi
PD_REGISTER_KERNEL(
pyramid_hash, CPU, ALL_LAYOUT, phi::CPUPyramidHashOPKernel, float, int8_t) {
kernel->InputAt(0).SetDataType(phi::DataType::INT32);
kernel->OutputAt(1).SetDataType(phi::DataType::INT32);
kernel->OutputAt(2).SetDataType(phi::DataType::FLOAT32);
}
```
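The inner loop of `hash_embedding_ff` above fills the output row in `_rand_len`-wide slices, each copied from the weight table at a position obtained by hashing the same input window with a different seed, while rolling the next two positions forward (so the kernel can prefetch them). Below is a minimal standalone sketch of that rolling-position scheme; `toy_hash` (based on `std::hash`) merely stands in for XXH32, and it assumes the weight table carries `rand_len` extra slack entries so the final copy stays in bounds:

```cpp
#include <cassert>
#include <cstring>
#include <functional>
#include <string>
#include <vector>

// Stand-in for XXH32(data, bytes, seed): folds the seed into the buffer and
// uses std::hash, purely for illustration; the kernel above uses real xxhash.
static unsigned int toy_hash(const float* data, int bytes, unsigned int seed) {
  std::string buf(reinterpret_cast<const char*>(data), bytes);
  buf.append(reinterpret_cast<const char*>(&seed), sizeof(seed));
  return static_cast<unsigned int>(std::hash<std::string>{}(buf));
}

// Mirror of hash_embedding_ff: each rand_len-wide slice of the output row is
// copied from the weight table at a position derived from a seeded hash of the
// same input window; pos1/pos2 roll forward so the next slice's position is
// already known one iteration ahead. The weight table must hold at least
// space_len + rand_len floats so the memcpy never runs past the end.
void toy_hash_embedding(const float* hash_id, int len, float* top_pos,
                        const float* weights, int num_emb, int rand_len,
                        int space_len) {
  unsigned int pos1 =
      toy_hash(hash_id, len * (int)sizeof(float), 0) % space_len;
  unsigned int pos2 =
      toy_hash(hash_id, len * (int)sizeof(float), rand_len) % space_len;
  for (int j = 0; j != num_emb; j += rand_len) {
    unsigned int pos3 =
        toy_hash(hash_id, len * (int)sizeof(float), j + 2 * rand_len) %
        space_len;
    std::memcpy(top_pos + j, weights + pos1, rand_len * sizeof(float));
    pos1 = pos2;
    pos2 = pos3;
  }
}
```

Seeding the table with `weights[i] = i` makes the scheme easy to verify: every `rand_len`-wide slice of the output is then a consecutive run of table entries starting at one of the hashed positions.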
```java
/*
* one or more contributor license agreements. See the NOTICE file distributed
* with this work for additional information regarding copyright ownership.
*/
package io.camunda.zeebe.engine.state.migration;
public interface DbMigrator {
void runMigrations();
}
```
Ian Michael Smith (born May 5, 1987) is an American actor, known for his starring role in Simon Birch.
His short physical stature is a result of Morquio syndrome, a rare enzymatic disorder affecting the circulatory, muscular and skeletal systems.
Life and work
Smith was born in Elmhurst, Illinois. A Chicago-area hospital worker approached his parents about him auditioning for the leading role in The Mighty, a feature film about a character with Morquio Syndrome. Kieran Culkin was cast instead, but Smith was recommended for the title role of Simon Birch (1998), a film based loosely on John Irving's novel A Prayer for Owen Meany, which also called for a small child actor.
He graduated from York Community High School in Elmhurst in 2005. He has undergone several operations including a spinal fusion and two bilateral osteotomies. He graduated from the Massachusetts Institute of Technology in 2009, and Gallaudet University in 2012 and now works as a software engineer. He is a co-founder of a nonprofit organization, Project Alloy, that gives assistance to underrepresented people in tech fields. In 2019, he joined a class action lawsuit against the City of Oakland for excluding people with disabilities from the city’s rent control program.
References
External links
Ian Michael Smith at Hollywood.com
1987 births
Male actors from Illinois
American male child actors
American male film actors
Living people
People from Elmhurst, Illinois
20th-century American male actors
Massachusetts Institute of Technology alumni
Gallaudet University alumni
Francesco Di Bartolo (Catania, Sicily, 1826 – 1913) was an Italian engraver and painter.
He resided in Catania for most of his adult life and became Director of the Civic Museum in his native city. He was an honorary professor at the Institute of Fine Arts of Naples. He was a prolific engraver and etcher (acquaforte). Among his coloured etchings are Gli Iconoclasti after Morelli, the animal paintings of Palizzi, and a series of portraits, including one of Count Cavour. Among his burin engravings are a Madonna by Murillo and the Madonna of the Harpies by Andrea del Sarto. He won various silver and gold prizes at exhibitions. He was awarded an honour in the Order of San Maurizio and became an associate member of many academies in Italy, as well as of the Academy in St Petersburg, Russia.
References
External links
Short Biography for an exhibition of this work
1826 births
1913 deaths
Artists from Catania
Kingdom of the Two Sicilies people
Italian engravers
19th-century Italian painters
Italian male painters
20th-century Italian painters
Artists from Sicily
19th-century Italian male artists
20th-century Italian male artists
20th-century engravers
```csharp
using System;
using NUnit.Framework;
using UnityEngine;
namespace UnityEditor.ShaderGraph.UnitTests
{
[TestFixture]
public class PixelShaderNodeTests
{
/* private UnityEngine.MaterialGraph.MaterialGraph m_Graph;
private Vector1Node m_InputOne;
private AbsoluteNode m_Abs;
private MetallicMasterNode m_PixelNode;
[TestFixtureSetUp]
public void RunBeforeAnyTests()
{
Debug.unityLogger.logHandler = new ConsoleLogHandler();
}
[SetUp]
public void TestSetUp()
{
m_Graph = new UnityEngine.MaterialGraph.MaterialGraph();
m_PixelNode = new MetallicMasterNode();
m_InputOne = new Vector1Node();
m_Abs = new AbsoluteNode();
m_Graph.AddNode(m_PixelNode);
m_Graph.AddNode(m_InputOne);
m_Graph.AddNode(m_Abs);
m_InputOne.value = 0.2f;
m_Graph.Connect(m_InputOne.GetSlotReference(Vector1Node.OutputSlotId), m_PixelNode.GetSlotReference(AbstractSurfaceMasterNode.NormalSlotId));
// m_Graph.Connect(m_InputOne.GetSlotReference(Vector1Node.OutputSlotId), m_Abs.GetSlotReference(Function1Input.InputSlotId));
//m_Graph.Connect(m_Abs.GetSlotReference(Function1Input.OutputSlotId), m_PixelNode.GetSlotReference(AbstractSurfaceMasterNode.AlbedoSlotId));
}
[Test]
public void TestNodeGeneratesCorrectNodeCode()
{
string expected = string.Format("half {0} = 0.2;" + Environment.NewLine
+ "half {1} = abs ({0});" + Environment.NewLine
+ "o.Albedo = {1};" + Environment.NewLine
+ "o.Normal = {0};" + Environment.NewLine
, m_InputOne.GetVariableNameForSlot(Vector1Node.OutputSlotId)
, m_Abs.GetVariableNameForSlot(Function1Input.OutputSlotId));
var generator = new ShaderGenerator();
m_PixelNode.GenerateNodeCode(generator, GenerationMode.ForReals);
Console.WriteLine(generator.GetShaderString(0));
Assert.AreEqual(expected, generator.GetShaderString(0));
Assert.AreEqual(string.Empty, generator.GetPragmaString());
}*/
}
}
```
```jsx
import * as React from 'react';
import { styled } from '@mui/material/styles';
import { IconButton } from '@mui/material';
import ActionHide from '@mui/icons-material/RemoveCircleOutline';
import clsx from 'clsx';
import { useResourceContext, useTranslate } from 'ra-core';
export const FilterFormInput = props => {
const { filterElement, handleHide, className } = props;
const resource = useResourceContext(props);
const translate = useTranslate();
return (
<Root
data-source={filterElement.props.source}
className={clsx('filter-field', className)}
>
{React.cloneElement(filterElement, {
resource,
record: emptyRecord,
size: filterElement.props.size ?? 'small',
helperText: false,
// ignore defaultValue in Field because it was already set in Form (via mergedInitialValuesWithDefaultValues)
defaultValue: undefined,
})}
{!filterElement.props.alwaysOn && (
<IconButton
className={clsx(
'hide-filter',
FilterFormInputClasses.hideButton
)}
onClick={handleHide}
data-key={filterElement.props.source}
title={translate('ra.action.remove_filter')}
size="small"
>
<ActionHide />
</IconButton>
)}
<div className={FilterFormInputClasses.spacer}> </div>
</Root>
);
};
const PREFIX = 'RaFilterFormInput';
export const FilterFormInputClasses = {
spacer: `${PREFIX}-spacer`,
hideButton: `${PREFIX}-hideButton`,
};
const Root = styled('div', {
name: PREFIX,
overridesResolver: (props, styles) => styles.root,
})(({ theme }) => ({
display: 'flex',
alignItems: 'flex-end',
pointerEvents: 'auto',
[theme.breakpoints.down('sm')]: {
width: '100%',
},
[`& .${FilterFormInputClasses.spacer}`]: { width: theme.spacing(2) },
[`& .${FilterFormInputClasses.hideButton}`]: {
marginBottom: theme.spacing(1),
},
}));
const emptyRecord = {};
```
```objective-c
//
// path_to_url
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
//
#import <CoreData/CoreData.h>
#import <EarlGrey/GREYConfiguration.h>
#import <EarlGrey/GREYManagedObjectContextIdlingResource.h>
#import "Synchronization/GREYUIThreadExecutor+Internal.h"
#import "GREYBaseTest.h"
#import "GREYExposedForTesting.h"
static const NSTimeInterval kSemaphoreTimeoutSeconds = 0.1;
static const NSTimeInterval kExpectationTimeoutSeconds = 1.0;
@interface GREYManagedObjectContextIdlingResourceTest : GREYBaseTest
@end
@implementation GREYManagedObjectContextIdlingResourceTest
- (void)testIdleAfterInitializingAndDrainingOnBackgroundQueue {
NSManagedObjectContext *managedObjectContext =
[self setUpContextWithConcurrencyType:NSPrivateQueueConcurrencyType];
GREYManagedObjectContextIdlingResource *managedObjectContextIdlingResource =
[self setUpContextIdlingResourceWithContext:managedObjectContext
trackingPendingChanges:YES];
[self drainDispatchQueue:[managedObjectContextIdlingResource managedObjectContextDispatchQueue]];
XCTAssertTrue([managedObjectContextIdlingResource isIdleNow]);
}
- (void)testIdleAfterInitializingAndDrainingOnMainQueue {
NSManagedObjectContext *managedObjectContext =
[self setUpContextWithConcurrencyType:NSMainQueueConcurrencyType];
GREYManagedObjectContextIdlingResource *managedObjectContextIdlingResource =
[self setUpContextIdlingResourceWithContext:managedObjectContext
trackingPendingChanges:YES];
[self drainDispatchQueue:[managedObjectContextIdlingResource managedObjectContextDispatchQueue]];
XCTAssertTrue([managedObjectContextIdlingResource isIdleNow]);
}
- (void)testBusyAfterAddingAsyncBlockOnBackgroundQueue {
NSManagedObjectContext *managedObjectContext =
[self setUpContextWithConcurrencyType:NSPrivateQueueConcurrencyType];
GREYManagedObjectContextIdlingResource *managedObjectContextIdlingResource =
[self setUpContextIdlingResourceWithContext:managedObjectContext
trackingPendingChanges:YES];
[self drainDispatchQueue:[managedObjectContextIdlingResource managedObjectContextDispatchQueue]];
XCTAssertTrue([managedObjectContextIdlingResource isIdleNow]);
XCTestExpectation *expectation = [self expectationWithDescription:@"Async block fired"];
dispatch_semaphore_t semaphore = dispatch_semaphore_create(0);
[managedObjectContext performBlock:^{
dispatch_time_t timeout = dispatch_time(DISPATCH_TIME_NOW,
(int64_t)(kSemaphoreTimeoutSeconds * NSEC_PER_SEC));
dispatch_semaphore_wait(semaphore, timeout);
[expectation fulfill];
}];
XCTAssertFalse([managedObjectContextIdlingResource isIdleNow]);
dispatch_semaphore_signal(semaphore);
[self waitForExpectationsWithTimeout:kExpectationTimeoutSeconds handler:nil];
// Drain the queue in order to avoid the race condition where the expectation has been fulfilled
// by the async task and the main thread resumes before that async task completes.
[self drainDispatchQueue:[managedObjectContextIdlingResource managedObjectContextDispatchQueue]];
XCTAssertTrue([managedObjectContextIdlingResource isIdleNow]);
}
- (void)testBusyAfterAddingAsyncBlockOnMainQueue {
NSManagedObjectContext *managedObjectContext =
[self setUpContextWithConcurrencyType:NSMainQueueConcurrencyType];
GREYManagedObjectContextIdlingResource *managedObjectContextIdlingResource =
[self setUpContextIdlingResourceWithContext:managedObjectContext
trackingPendingChanges:YES];
[self drainDispatchQueue:[managedObjectContextIdlingResource managedObjectContextDispatchQueue]];
XCTAssertTrue([managedObjectContextIdlingResource isIdleNow]);
XCTestExpectation *expectation = [self expectationWithDescription:@"Async block fired"];
[managedObjectContext performBlock:^{
[expectation fulfill];
}];
XCTAssertFalse([managedObjectContextIdlingResource isIdleNow]);
[self waitForExpectationsWithTimeout:kExpectationTimeoutSeconds handler:nil];
XCTAssertTrue([managedObjectContextIdlingResource isIdleNow]);
}
- (void)testIdleAfterSyncBlockOnBackgroundQueue {
NSManagedObjectContext *managedObjectContext =
[self setUpContextWithConcurrencyType:NSPrivateQueueConcurrencyType];
GREYManagedObjectContextIdlingResource *managedObjectContextIdlingResource =
[self setUpContextIdlingResourceWithContext:managedObjectContext
trackingPendingChanges:YES];
[self drainDispatchQueue:[managedObjectContextIdlingResource managedObjectContextDispatchQueue]];
XCTAssertTrue([managedObjectContextIdlingResource isIdleNow]);
[managedObjectContext performBlockAndWait:^{}];
XCTAssertTrue([managedObjectContextIdlingResource isIdleNow]);
}
- (void)testIdleAfterSyncBlockOnMainQueue {
NSManagedObjectContext *managedObjectContext =
[self setUpContextWithConcurrencyType:NSMainQueueConcurrencyType];
GREYManagedObjectContextIdlingResource *managedObjectContextIdlingResource =
[self setUpContextIdlingResourceWithContext:managedObjectContext
trackingPendingChanges:YES];
[self drainDispatchQueue:[managedObjectContextIdlingResource managedObjectContextDispatchQueue]];
XCTAssertTrue([managedObjectContextIdlingResource isIdleNow]);
[managedObjectContext performBlockAndWait:^{}];
XCTAssertTrue([managedObjectContextIdlingResource isIdleNow]);
}
- (void)testBusyAfterSyncMutationBlockOnBackgroundQueue {
NSManagedObjectContext *managedObjectContext =
[self setUpContextWithConcurrencyType:NSPrivateQueueConcurrencyType];
GREYManagedObjectContextIdlingResource *managedObjectContextIdlingResource =
[self setUpContextIdlingResourceWithContext:managedObjectContext
trackingPendingChanges:YES];
[self drainDispatchQueue:[managedObjectContextIdlingResource managedObjectContextDispatchQueue]];
XCTAssertTrue([managedObjectContextIdlingResource isIdleNow]);
[managedObjectContext performBlockAndWait:^{
[self insertSimpleManagedObjectIntoContext:managedObjectContext];
}];
// Drain to make sure the block didn't kick off another operation on the queue.
[self drainDispatchQueue:[managedObjectContextIdlingResource managedObjectContextDispatchQueue]];
XCTAssertFalse([managedObjectContextIdlingResource isIdleNow],
@"Should be busy because of pending change.");
[managedObjectContext performBlockAndWait:^{
[managedObjectContext save:nil];
}];
// Need to drain since the save kicks off another async task.
[self drainDispatchQueue:[managedObjectContextIdlingResource managedObjectContextDispatchQueue]];
XCTAssertTrue([managedObjectContextIdlingResource isIdleNow]);
}
- (void)testBusyAfterSyncMutationBlockOnMainQueue {
NSManagedObjectContext *managedObjectContext =
[self setUpContextWithConcurrencyType:NSMainQueueConcurrencyType];
GREYManagedObjectContextIdlingResource *managedObjectContextIdlingResource =
[self setUpContextIdlingResourceWithContext:managedObjectContext
trackingPendingChanges:YES];
[self drainDispatchQueue:[managedObjectContextIdlingResource managedObjectContextDispatchQueue]];
XCTAssertTrue([managedObjectContextIdlingResource isIdleNow]);
[managedObjectContext performBlockAndWait:^{
[self insertSimpleManagedObjectIntoContext:managedObjectContext];
}];
// Drain to make sure the block didn't kick off another operation on the queue.
[self drainDispatchQueue:[managedObjectContextIdlingResource managedObjectContextDispatchQueue]];
XCTAssertFalse([managedObjectContextIdlingResource isIdleNow],
@"Should be busy because of pending change.");
[managedObjectContext performBlockAndWait:^{
[managedObjectContext save:nil];
}];
// Need to drain since the save kicks off another async task.
[self drainDispatchQueue:[managedObjectContextIdlingResource managedObjectContextDispatchQueue]];
XCTAssertTrue([managedObjectContextIdlingResource isIdleNow]);
}
- (void)testIdleAfterSyncMutationBlockWhenNotTrackingPendingChanges {
NSManagedObjectContext *managedObjectContext =
[self setUpContextWithConcurrencyType:NSPrivateQueueConcurrencyType];
GREYManagedObjectContextIdlingResource *managedObjectContextIdlingResource =
[self setUpContextIdlingResourceWithContext:managedObjectContext
trackingPendingChanges:NO];
[self drainDispatchQueue:[managedObjectContextIdlingResource managedObjectContextDispatchQueue]];
XCTAssertTrue([managedObjectContextIdlingResource isIdleNow]);
[managedObjectContext performBlockAndWait:^{
[self insertSimpleManagedObjectIntoContext:managedObjectContext];
}];
XCTAssertTrue([managedObjectContextIdlingResource isIdleNow]);
}
- (void)testIdlingResourceWeaklyHoldsContextAndDeregistersItself {
GREYManagedObjectContextIdlingResource *contextIdlingResource;
GREYUIThreadExecutor *threadExecutor = [GREYUIThreadExecutor sharedInstance];
@autoreleasepool {
__autoreleasing NSManagedObjectContext *managedObjectContext =
[self setUpContextWithConcurrencyType:NSPrivateQueueConcurrencyType];
contextIdlingResource =
[self setUpContextIdlingResourceWithContext:managedObjectContext
trackingPendingChanges:YES];
[threadExecutor registerIdlingResource:contextIdlingResource];
XCTAssertTrue([threadExecutor grey_isTrackingIdlingResource:contextIdlingResource]);
}
XCTAssertTrue([contextIdlingResource isIdleNow]);
XCTAssertFalse([threadExecutor grey_isTrackingIdlingResource:contextIdlingResource]);
}
#pragma mark - Private Methods
- (NSManagedObjectContext *)
setUpContextWithConcurrencyType:(NSManagedObjectContextConcurrencyType)concurrencyType {
NSEntityDescription *entity = [[NSEntityDescription alloc] init];
[entity setName:@"EarlGreyCustomEntity"];
[entity setManagedObjectClassName:@"NSManagedObject"];
NSManagedObjectModel *managedObjectModel = [[NSManagedObjectModel alloc] init];
[managedObjectModel setEntities:@[entity]];
// In memory coordinator
NSPersistentStoreCoordinator *coordinator =
[[NSPersistentStoreCoordinator alloc] initWithManagedObjectModel:managedObjectModel];
[coordinator addPersistentStoreWithType:NSInMemoryStoreType
configuration:nil
URL:nil
options:nil
error:nil];
// Context with |concurrencyType|
NSManagedObjectContext *managedObjectContext =
[[NSManagedObjectContext alloc] initWithConcurrencyType:concurrencyType];
managedObjectContext.persistentStoreCoordinator = coordinator;
return managedObjectContext;
}
- (GREYManagedObjectContextIdlingResource *)
setUpContextIdlingResourceWithContext:(NSManagedObjectContext *)context
trackingPendingChanges:(BOOL)trackChanges {
NSString *resourceName = @"Test managed object idling resource";
GREYManagedObjectContextIdlingResource *idlingResource =
[GREYManagedObjectContextIdlingResource resourceWithManagedObjectContext:context
trackPendingChanges:trackChanges
name:resourceName];
return idlingResource;
}
- (void)drainDispatchQueue:(dispatch_queue_t)dispatchQueue {
if (dispatchQueue == dispatch_get_main_queue()) {
// Call dispatch_sync on the main dispatch queue from the main thread will cause a deadlock,
// so we need to dispatch an async operation and wait on it.
XCTestExpectation *expectation = [self expectationWithDescription:@"Async block fired."];
dispatch_async(dispatchQueue, ^{
[expectation fulfill];
});
[self waitForExpectationsWithTimeout:1.0 handler:nil];
} else {
// Drain the private queue by dispatching a synchronous task.
dispatch_sync(dispatchQueue, ^{});
}
}
- (void)insertSimpleManagedObjectIntoContext:(NSManagedObjectContext *)managedObjectContext {
NSEntityDescription *entityDescription =
[NSEntityDescription entityForName:@"EarlGreyCustomEntity"
inManagedObjectContext:managedObjectContext];
// NSManagedObject::initWithEntity:insertIntoManagedObjectContext: initializes a managed object
// and inserts it into the context. To keep the compiler from complaining about not using the
// initialized object, we hold on to the reference with an unused variable.
__unused NSManagedObject *managedObject =
[[NSManagedObject alloc] initWithEntity:entityDescription
insertIntoManagedObjectContext:managedObjectContext];
}
@end
```
The Social Development Network (SDN), formerly the Social Development Unit (SDU) and Social Development Service (SDS), is a governmental body under the Ministry of Social and Family Development of Singapore. It works closely with the community and commercial sectors to foster opportunities for singles to interact in social settings in Singapore. Besides coordinating and facilitating dating activities offered by the private sector, it also serves to educate the public on singlehood issues. Other responsibilities of the SDN include providing the necessary infrastructure and support for the dating industry, as well as ensuring the professionalism of dating agencies through an accreditation council formed in 2007.
SDN's initial role in the industry was to organise activities for its members to interact. In 2006, the SDU changed focus to accrediting and funding private matchmaking and dating agencies and projects, in the hope of creating a more vibrant dating scene and giving singles more options to interact with others of different educational levels.
Initiatives
A census conducted in 1980 revealed that a large number of highly educated women were still unmarried, despite being above 40 years of age. It was also noted that there was an inverse relationship between a person's educational level and the number of children they had. Dr Tony Tan, then Minister of Finance and Trade and Industry, attributed this finding to two factors: first, the cultural attitude among local men, who preferred to marry women with lower educational qualifications than themselves; and second, the preference among female university graduates for men who were better educated than, or at least of the same educational level as, themselves. The SDU was thus formed in January 1984 to provide opportunities for single men and women to interact socially. Another objective of the unit was to encourage public discussion about the perceived problem of the large number of better-educated women remaining unmarried.
When it was first established, the SDU's target group was limited to university graduates. The government justified this elitist approach by announcing that it had identified graduates, and in particular the women among them, as a group that required assistance in finding lifelong partners. According to the government, non-graduates did not seem to have any difficulty finding partners. However, the Social Development Service (SDS) was set up a year after the SDU to promote marriage among non-graduates. On 28 January 2009, the SDU and SDS merged into a single entity, tentatively named SDU-SDS, consolidating their respective resources and exposing their members to the larger, merged database. It was renamed the Social Development Network (SDN) on 16 October 2009.
Prior to the founding of the SDU, a Great Marriage Debate had been raging. During a speech at the 1983 National Day Rally, then-Prime Minister Lee Kuan Yew alleged that the phenomenon of graduates remaining single would result in a projected loss of about 400 talented people per year. This estimate rested on the premise, supported by studies at the time, that talent was not so much nurtured as conceived. Lee had also expressed worry that the dearth of children produced by graduate women would lead to a faltering economy and ultimately a decline in society. Although Lee did not explicitly state that the SDU would be set up in response to this problem, he promised that the government would take tough measures to curb it. The fact that the SDU was formed the following year has led many to perceive the debate as the main reason behind the establishment of the unit and its exclusive focus.
Public response
Public reaction to the former SDU was initially that of disdain; graduate women were unhappy about their plight being addressed so prominently, while non-graduate women and their parents were upset at the government for dissuading graduate men from marrying them. In fact, there was an outcry by the public at large about the unfair use of taxpayers' money to subsidise leisure activities for graduates, especially since they already had a higher income. Some were also displeased about the fact that civil servants who were single were given three extra days of leave to go on SDU-organised cruises.
Besides those who treated the SDU with contempt, there were others who simply did not take the SDU seriously. One popular joke that was conceived in the early days was that 'SDU' also stood for 'Single, Desperate and Ugly'.
Results and success rates
Within the SDU
Results were slow at first, and the effectiveness of the SDU was questioned by both the public and members of parliament. From its establishment in January 1984 to March 1985, only 2 marriages had taken place as a result of the SDU's efforts. This was not proportionate to the amount the SDU had spent on its activities, which came to $294,411.32 at the time. However, marriage figures began to rise: between March and July 1985, another 4 couples were married, and the number of SDU marriages has increased over the years since. By the end of the first decade, marriages resulting from SDU activities averaged about 1,000 a year. In 2003, the SDU reported a significant increase in marriages among its members, from 2,789 in 1999 to 4,050 in 2003. Over its first 2 decades, the SDU reported that more than 33,000 members had married (a figure that includes Singaporeans who would have married irrespective of the SDU).
On the national level
Despite the promising numbers reported by the SDU, statistics at the national level do not mirror the trends within the SDU. According to the Singapore Department of Statistics, there was only a slight increase in marriages, from 22,561 in 2000 to 22,992 in 2005. In fact, for those aged 30–34, the proportion remaining single increased significantly between 2000 and 2005, as more people chose to delay marriage. In 2005, among citizens aged 30–34, the proportion that was single was 37 per cent for males and 26 per cent for females. Among those aged 40–44, the proportion single in 2005 remained high, at 15–17 per cent for males and females.
The number of marriages increased steadily from 22,189 in 2004 to 26,081 in 2009, and the crude marriage rate (per 1,000 resident population) rose from 6.4 in 2007 to 6.6 in 2009. However, more singles are also marrying later: over the preceding ten years, the mean age at first marriage increased from 28.7 to 29.8 for males and from 26.2 to 27.5 for females.
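The crude marriage rate quoted above is simply marriages per 1,000 resident population, so the 2009 figures can be cross-checked with a few lines of arithmetic. Note that the population figure used here (roughly 3.95 million) is inferred from the numbers in the text, not stated in it:

```java
public class CrudeMarriageRate {
    // Crude marriage rate = (marriages / resident population) x 1,000.
    static double crudeRate(int marriages, double residentPopulation) {
        return marriages / residentPopulation * 1000.0;
    }

    public static void main(String[] args) {
        // 2009: 26,081 marriages at a reported rate of 6.6 per 1,000 residents
        // implies a resident population of about 3.95 million (inferred figure).
        System.out.printf("%.1f%n", crudeRate(26_081, 3_952_000));
    }
}
```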
Survey results
In addition to marriage and membership statistics, surveys have also been conducted by the SDU from time to time to assess its effectiveness, and its public-education objective in particular. In mid-2006, 2,041 university undergraduates and polytechnic students were polled in an SDU survey on undergraduates' attitudes towards social interaction, dating and marriage. The data, collected in a series of face-to-face interviews, showed that while nine out of ten would like to get married, only 35 per cent were certain about their intentions.
Earlier in the same year, SDU had conducted another survey which assessed attitudes of singles in general towards courtship and marriage. The survey, which included a variety of age groups, showed that 81 per cent of the 1,800 respondents indicated that they would like to get married. Yet, 74 per cent of the singles also mentioned that achieving success in work and studies was their top priority.
In the Survey on Singles' Attitudes towards Courtship and Marriage conducted by the Social Development Network (SDN) in 2009, about 80% of the more than 1,500 singles surveyed indicated an intention to get married. However, respondents also cited "not finding the right one or someone who met my expectations" as the top reason for delaying marriage. Other frequently cited reasons included "not enough opportunities to socialize" and "lack of time", as career and other commitments were viewed as equally or more important than marriage, or as milestones to be achieved before respondents would think about marriage.
Activities and events
In the beginning, the activities organised by the former SDU included personal-effectiveness workshops, computer courses, barbecues, dancing lessons, and cruises and tours to the Maldives, Club Méditerranée and Japan. The SDU's policy then was that part of the fees for courses would be sponsored by the unit, while participants bore the full cost of cruises. However, it was discovered that the costs of cruises were significantly below market rates and that participants were granted leave by their companies for the duration of the cruise.
Activities offered by the SDU range from dating and wine-and-dine events to self-enrichment, sports and recreation, and travel, and the range has become much more extensive over the years. Self-enrichment activities consist mostly of dance lessons, spanning hip hop, exotic dance and ballroom waltz; other self-enrichment courses include wine appreciation, Pilates and baking classes. Overseas trips to many parts of Asia (such as China, Thailand, Malaysia, Vietnam and Nepal) are offered under the travel category, with tours ranging from one day to 18 days and catering to different budgets and interests. SDU members are entitled to a subsidised rate for most activities, regardless of the nature of the event.
Since 2009, SDN has been working with different partners from the people, private and public sectors to create a broad array of opportunities for singles to meet and form meaningful relationships. Apart from collaborating with Government agencies under the Social Development Officers (SDO) Network to provide small interaction activities for single employees in the civil service, SDN also partners selected commercial and community organisations to facilitate larger-scale events for singles.
Membership
When the SDU first started, membership was free of charge and valid until the member registered his or her marriage, either at the Registry or under Muslim law. Members paid only for the activities they registered for, and even then the costs were subsidised by the government. Many revisions have since been made to the various membership schemes, including the introduction of an annual membership fee.
From October 2009, all membership schemes were removed and the SDN extended its benefits to all resident singles free of charge. Currently, singles aged 20 and above who are Singapore citizens or permanent residents can join the network's database to receive communications and information on dating, or register as users to enjoy its online services, such as the chat function, forum discussions and personal ads, and to access information on events and other resources on dating and relationships.
Project Network
Project Network is an initiative that was started by the SDN in 2003 to provide funding for programmes helmed by local universities that promote social skills or provide socialisation opportunities for undergraduates. In order for projects to qualify for funding, they must satisfy one of the following two conditions:
Opportunities for interaction between the two genders are maximised
Undergraduates are taught social networking skills for personal development
The amount of funding provided depends on several factors, such as gender balance (the bare minimum being a 60–40 split), the originality of the programme, the degree of reach and the extent to which participants will benefit from the social interaction. As of 30 March 2007, the SDN had funded about $275,000 worth of social activities.
Through Project Network, SDN has helped to co-fund orientation camps and student activities organised by the local universities to facilitate more gender-balanced social interaction opportunities on campus. Official student bodies may apply for the fund if they meet the criteria of the funding scheme, such as balanced gender ratio, meaningful interaction among opposite genders, etc.
References
External links
SDN website
SDN accredited agencies
Ministry of Social and Family Development website
1984 establishments in Singapore
Government agencies established in 1984
Organisations of the Singapore Government
Pterostylis reflexa, commonly known as the dainty greenhood, is a species of orchid endemic to New South Wales. As with similar greenhoods, the flowering plants differ from those which are not flowering. The non-flowering plants have a rosette of leaves flat on the ground but the flowering plants have a single flower with leaves on the flowering stem. This greenhood has a relatively large white, green and light brown flower with a long, curved dorsal sepal and a protruding labellum.
Description
Pterostylis reflexa is a terrestrial, perennial, deciduous herb with an underground tuber and, when not flowering, a rosette of between three and seven egg-shaped leaves lying flat on the ground. Each leaf is long and wide. Flowering plants have a single sickle-shaped flower, long and wide, on a flowering stem high with between three and five stem leaves. The flowers are white, green and light brown. The dorsal sepal and petals are fused, forming a hood or "galea" over the column, the dorsal sepal with a narrow tip long. The lateral sepals are in loose contact with the galea and have erect, thread-like tips long. There is a curved, V-shaped sinus between their bases. The labellum is long, about wide, reddish-brown and curved, with about one-third protruding above the sinus. Flowering occurs from March to June.
Taxonomy and naming
Pterostylis reflexa was first formally described in 1810 by Robert Brown and the description was published in Prodromus Florae Novae Hollandiae et Insulae Van Diemen. The specific epithet (reflexa) is a Latin word meaning "bent or turned back."
Distribution and habitat
The dainty greenhood mainly grows on ridges and slopes in coastal and near-coastal forest between about Taree and Nowra.
References
External links
reflexa
Endemic orchids of Australia
Orchids of New South Wales
Plants described in 1810
```java
/*
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *   http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package io.ballerina.projects.internal;
/**
* {@code DocumentData} represents a Ballerina source file (.bal).
*
* @since 2.0.0
*/
public class DocumentData {
//TODO: Remove this class and use DocumentConfig for creating a document
private final String name;
private String content;
private DocumentData(String name, String content) {
this.name = name;
this.content = content;
}
public static DocumentData from(String name, String content) {
return new DocumentData(name, content);
}
public String content() {
return content;
}
public String name() {
return name;
}
}
```
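A minimal, self-contained sketch of how the static factory above is meant to be used. The nested class body is duplicated from the listing so the example compiles on its own; in a real project you would simply import io.ballerina.projects.internal.DocumentData:

```java
public class DocumentDataDemo {
    // Local copy of the DocumentData listing above, so this sketch is standalone.
    static class DocumentData {
        private final String name;
        private String content;

        private DocumentData(String name, String content) {
            this.name = name;
            this.content = content;
        }

        static DocumentData from(String name, String content) {
            return new DocumentData(name, content);
        }

        String name() { return name; }
        String content() { return content; }
    }

    public static void main(String[] args) {
        // The private constructor forces creation through the factory method.
        DocumentData doc = DocumentData.from("main.bal", "public function main() {\n}");
        System.out.println(doc.name());    // main.bal
        System.out.println(doc.content().length());
    }
}
```

Routing construction through `from` rather than a public constructor leaves room to add validation or caching later without changing call sites.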
```csharp
// Licensed to the .NET Foundation under one or more agreements.
// The .NET Foundation licenses this file to you under the MIT license.
// See the LICENSE file in the project root for more information.
using System;
using System.Collections;
using System.Collections.Generic;
using System.Collections.ObjectModel;
using System.Collections.Specialized;
using System.ComponentModel;
using System.ComponentModel.DataAnnotations;
using System.Diagnostics;
using System.Security;
using System.Text;
using Microsoft.Toolkit.Uwp.UI.Automation.Peers;
using Microsoft.Toolkit.Uwp.UI.Controls.DataGridInternals;
using Microsoft.Toolkit.Uwp.UI.Controls.Primitives;
using Microsoft.Toolkit.Uwp.UI.Controls.Utilities;
using Microsoft.Toolkit.Uwp.UI.Data.Utilities;
using Microsoft.Toolkit.Uwp.UI.Utilities;
using Microsoft.Toolkit.Uwp.Utilities;
using Windows.ApplicationModel.DataTransfer;
using Windows.Devices.Input;
using Windows.Foundation;
using Windows.Foundation.Collections;
using Windows.System;
using Windows.UI.Input;
using Windows.UI.Xaml;
using Windows.UI.Xaml.Automation.Peers;
using Windows.UI.Xaml.Controls;
using Windows.UI.Xaml.Controls.Primitives;
using Windows.UI.Xaml.Data;
using Windows.UI.Xaml.Input;
using Windows.UI.Xaml.Media;
using Windows.UI.Xaml.Media.Animation;
using DiagnosticsDebug = System.Diagnostics.Debug;
namespace Microsoft.Toolkit.Uwp.UI.Controls
{
/// <summary>
/// Control to represent data in columns and rows.
/// </summary>
#if FEATURE_VALIDATION_SUMMARY
[TemplatePart(Name = DataGrid.DATAGRID_elementValidationSummary, Type = typeof(ValidationSummary))]
#endif
[TemplatePart(Name = DataGrid.DATAGRID_elementRowsPresenterName, Type = typeof(DataGridRowsPresenter))]
[TemplatePart(Name = DataGrid.DATAGRID_elementColumnHeadersPresenterName, Type = typeof(DataGridColumnHeadersPresenter))]
[TemplatePart(Name = DataGrid.DATAGRID_elementFrozenColumnScrollBarSpacerName, Type = typeof(FrameworkElement))]
[TemplatePart(Name = DataGrid.DATAGRID_elementHorizontalScrollBarName, Type = typeof(ScrollBar))]
[TemplatePart(Name = DataGrid.DATAGRID_elementVerticalScrollBarName, Type = typeof(ScrollBar))]
[TemplateVisualState(Name = VisualStates.StateDisabled, GroupName = VisualStates.GroupCommon)]
[TemplateVisualState(Name = VisualStates.StateNormal, GroupName = VisualStates.GroupCommon)]
[TemplateVisualState(Name = VisualStates.StateTouchIndicator, GroupName = VisualStates.GroupScrollBars)]
[TemplateVisualState(Name = VisualStates.StateMouseIndicator, GroupName = VisualStates.GroupScrollBars)]
[TemplateVisualState(Name = VisualStates.StateMouseIndicatorFull, GroupName = VisualStates.GroupScrollBars)]
[TemplateVisualState(Name = VisualStates.StateNoIndicator, GroupName = VisualStates.GroupScrollBars)]
[TemplateVisualState(Name = VisualStates.StateSeparatorExpanded, GroupName = VisualStates.GroupScrollBarsSeparator)]
[TemplateVisualState(Name = VisualStates.StateSeparatorCollapsed, GroupName = VisualStates.GroupScrollBarsSeparator)]
[TemplateVisualState(Name = VisualStates.StateSeparatorExpandedWithoutAnimation, GroupName = VisualStates.GroupScrollBarsSeparator)]
[TemplateVisualState(Name = VisualStates.StateSeparatorCollapsedWithoutAnimation, GroupName = VisualStates.GroupScrollBarsSeparator)]
[TemplateVisualState(Name = VisualStates.StateInvalid, GroupName = VisualStates.GroupValidation)]
[TemplateVisualState(Name = VisualStates.StateValid, GroupName = VisualStates.GroupValidation)]
[StyleTypedProperty(Property = "CellStyle", StyleTargetType = typeof(DataGridCell))]
[StyleTypedProperty(Property = "ColumnHeaderStyle", StyleTargetType = typeof(DataGridColumnHeader))]
[StyleTypedProperty(Property = "DragIndicatorStyle", StyleTargetType = typeof(ContentControl))]
[StyleTypedProperty(Property = "DropLocationIndicatorStyle", StyleTargetType = typeof(Control))]
[StyleTypedProperty(Property = "RowHeaderStyle", StyleTargetType = typeof(DataGridRowHeader))]
[StyleTypedProperty(Property = "RowStyle", StyleTargetType = typeof(DataGridRow))]
public partial class DataGrid : Control
{
private enum ScrollBarVisualState
{
NoIndicator,
TouchIndicator,
MouseIndicator,
MouseIndicatorFull
}
private enum ScrollBarsSeparatorVisualState
{
SeparatorCollapsed,
SeparatorExpanded,
SeparatorExpandedWithoutAnimation,
SeparatorCollapsedWithoutAnimation
}
#if FEATURE_VALIDATION_SUMMARY
private const string DATAGRID_elementValidationSummary = "ValidationSummary";
#endif
private const string DATAGRID_elementRootName = "Root";
private const string DATAGRID_elementRowsPresenterName = "RowsPresenter";
private const string DATAGRID_elementColumnHeadersPresenterName = "ColumnHeadersPresenter";
private const string DATAGRID_elementFrozenColumnScrollBarSpacerName = "FrozenColumnScrollBarSpacer";
private const string DATAGRID_elementHorizontalScrollBarName = "HorizontalScrollBar";
private const string DATAGRID_elementRowHeadersPresenterName = "RowHeadersPresenter";
private const string DATAGRID_elementTopLeftCornerHeaderName = "TopLeftCornerHeader";
private const string DATAGRID_elementTopRightCornerHeaderName = "TopRightCornerHeader";
private const string DATAGRID_elementBottomRightCornerHeaderName = "BottomRightCorner";
private const string DATAGRID_elementVerticalScrollBarName = "VerticalScrollBar";
private const bool DATAGRID_defaultAutoGenerateColumns = true;
private const bool DATAGRID_defaultCanUserReorderColumns = true;
private const bool DATAGRID_defaultCanUserResizeColumns = true;
private const bool DATAGRID_defaultCanUserSortColumns = true;
private const DataGridGridLinesVisibility DATAGRID_defaultGridLinesVisibility = DataGridGridLinesVisibility.None;
private const DataGridHeadersVisibility DATAGRID_defaultHeadersVisibility = DataGridHeadersVisibility.Column;
private const DataGridRowDetailsVisibilityMode DATAGRID_defaultRowDetailsVisibility = DataGridRowDetailsVisibilityMode.VisibleWhenSelected;
private const DataGridSelectionMode DATAGRID_defaultSelectionMode = DataGridSelectionMode.Extended;
private const ScrollBarVisibility DATAGRID_defaultScrollBarVisibility = ScrollBarVisibility.Auto;
/// <summary>
/// The default order to use for columns when there is no <see cref="DisplayAttribute.Order"/>
/// value available for the property.
/// </summary>
/// <remarks>
/// The value of 10,000 comes from the DataAnnotations spec, allowing
/// some properties to be ordered at the beginning and some at the end.
/// </remarks>
private const int DATAGRID_defaultColumnDisplayOrder = 10000;
private const double DATAGRID_horizontalGridLinesThickness = 1;
private const double DATAGRID_minimumRowHeaderWidth = 4;
private const double DATAGRID_minimumColumnHeaderHeight = 4;
internal const double DATAGRID_maximumStarColumnWidth = 10000;
internal const double DATAGRID_minimumStarColumnWidth = 0.001;
private const double DATAGRID_mouseWheelDeltaDivider = 4.0;
private const double DATAGRID_maxHeadersThickness = 32768;
private const double DATAGRID_defaultRowHeight = 22;
internal const double DATAGRID_defaultRowGroupSublevelIndent = 20;
private const double DATAGRID_defaultMinColumnWidth = 20;
private const double DATAGRID_defaultMaxColumnWidth = double.PositiveInfinity;
private const double DATAGRID_defaultIncrementalLoadingThreshold = 3.0;
private const double DATAGRID_defaultDataFetchSize = 3.0;
// 2 seconds delay used to hide the scroll bars for example when OS animations are turned off.
private const int DATAGRID_noScrollBarCountdownMs = 2000;
// Used to work around double arithmetic rounding.
private const double DATAGRID_roundingDelta = 0.0001;
// DataGrid Template Parts
#if FEATURE_VALIDATION_SUMMARY
private ValidationSummary _validationSummary;
#endif
private UIElement _bottomRightCorner;
private DataGridColumnHeadersPresenter _columnHeadersPresenter;
private ScrollBar _hScrollBar;
private DataGridRowsPresenter _rowsPresenter;
private ScrollBar _vScrollBar;
private byte _autoGeneratingColumnOperationCount;
private bool _autoSizingColumns;
private List<ValidationResult> _bindingValidationResults;
private ContentControl _clipboardContentControl;
private IndexToValueTable<Visibility> _collapsedSlotsTable;
private bool _columnHeaderHasFocus;
private DataGridCellCoordinates _currentCellCoordinates;
// used to store the current column during a Reset
private int _desiredCurrentColumnIndex;
private int _editingColumnIndex;
private RoutedEventArgs _editingEventArgs;
private bool _executingLostFocusActions;
private bool _flushCurrentCellChanged;
private bool _focusEditingControl;
private FocusInputDeviceKind _focusInputDevice;
private DependencyObject _focusedObject;
private DataGridRow _focusedRow;
private FrameworkElement _frozenColumnScrollBarSpacer;
private bool _hasNoIndicatorStateStoryboardCompletedHandler;
private DispatcherQueueTimer _hideScrollBarsTimer;
// the sum of the widths in pixels of the scrolling columns preceding
// the first displayed scrolling column
private double _horizontalOffset;
private byte _horizontalScrollChangesIgnored;
private bool _ignoreNextScrollBarsLayout;
private List<ValidationResult> _indeiValidationResults;
private bool _initializingNewItem;
private bool _isHorizontalScrollBarInteracting;
private bool _isVerticalScrollBarInteracting;
// Set to True when the pointer is over the optional scroll bars.
private bool _isPointerOverHorizontalScrollBar;
private bool _isPointerOverVerticalScrollBar;
// Set to True to prevent the normal fade-out of the scroll bars.
private bool _keepScrollBarsShowing;
// Nth row of rows 0..N that make up the RowHeightEstimate
private int _lastEstimatedRow;
private List<DataGridRow> _loadedRows;
// prevents reentry into the VerticalScroll event handler
private Queue<Action> _lostFocusActions;
private bool _makeFirstDisplayedCellCurrentCellPending;
private bool _measured;
// the number of pixels of the firstDisplayedScrollingCol which are not displayed
private double _negHorizontalOffset;
// the number of pixels of DisplayData.FirstDisplayedScrollingRow which are not displayed
private int _noCurrentCellChangeCount;
private int _noFocusedColumnChangeCount;
private int _noSelectionChangeCount;
private double _oldEdgedRowsHeightCalculated = 0.0;
// Set to True to favor mouse indicators over panning indicators for the scroll bars.
private bool _preferMouseIndicators;
private DataGridCellCoordinates _previousAutomationFocusCoordinates;
private DataGridColumn _previousCurrentColumn;
private object _previousCurrentItem;
private List<ValidationResult> _propertyValidationResults;
private ScrollBarVisualState _proposedScrollBarsState;
private ScrollBarsSeparatorVisualState _proposedScrollBarsSeparatorState;
private string _rowGroupHeaderPropertyNameAlternative;
private ObservableCollection<Style> _rowGroupHeaderStyles;
// To figure out what the old RowGroupHeaderStyle was for each level, we need to keep a copy
// of the list. The old styles are important so we don't blow away styles set directly on the RowGroupHeader
private List<Style> _rowGroupHeaderStylesOld;
private double[] _rowGroupHeightsByLevel;
private double _rowHeaderDesiredWidth;
private Size? _rowsPresenterAvailableSize;
private bool _scrollingByHeight;
private DataGridSelectedItemsCollection _selectedItems;
private IndexToValueTable<Visibility> _showDetailsTable;
// Set to True when the mouse scroll bars are currently showing.
private bool _showingMouseIndicators;
private bool _successfullyUpdatedSelection;
private bool _temporarilyResetCurrentCell;
private bool _isUserSorting; // True if we're currently in a user invoked sorting operation
private ContentControl _topLeftCornerHeader;
private ContentControl _topRightCornerHeader;
private object _uneditedValue; // Represents the original current cell value at the time it enters editing mode.
private string _updateSourcePath;
private Dictionary<INotifyDataErrorInfo, string> _validationItems;
private List<ValidationResult> _validationResults;
private byte _verticalScrollChangesIgnored;
#if FEATURE_ICOLLECTIONVIEW_GROUP
private INotifyCollectionChanged _topLevelGroup;
#else
private IObservableVector<object> _topLevelGroup;
#endif
#if FEATURE_VALIDATION_SUMMARY
private ValidationSummaryItem _selectedValidationSummaryItem;
#endif
// An approximation of the sum of the heights in pixels of the scrolling rows preceding
// the first displayed scrolling row. Since the scrolled off rows are discarded, the grid
// does not know their actual height. The heights used for the approximation are the ones
// set as the rows were scrolled off.
private double _verticalOffset;
#if FEATURE_ICOLLECTIONVIEW_GROUP
// Cache event listeners for PropertyChanged and CollectionChanged events from CollectionViewGroups
private Dictionary<INotifyPropertyChanged, WeakEventListener<DataGrid, object, PropertyChangedEventArgs>> _groupsPropertyChangedListenersTable = new Dictionary<INotifyPropertyChanged, WeakEventListener<DataGrid, object, PropertyChangedEventArgs>>();
private Dictionary<INotifyCollectionChanged, WeakEventListener<DataGrid, object, NotifyCollectionChangedEventArgs>> _groupsCollectionChangedListenersTable = new Dictionary<INotifyCollectionChanged, WeakEventListener<DataGrid, object, NotifyCollectionChangedEventArgs>>();
#else
// Cache event listeners for VectorChanged events from ICollectionViewGroup's GroupItems
private Dictionary<IObservableVector<object>, WeakEventListener<DataGrid, object, IVectorChangedEventArgs>> _groupsVectorChangedListenersTable = new Dictionary<IObservableVector<object>, WeakEventListener<DataGrid, object, IVectorChangedEventArgs>>();
#endif
/// <summary>
/// Occurs one time for each public, non-static property in the bound data type when the
/// <see cref="ItemsSource"/> property is changed and the
/// <see cref="AutoGenerateColumns"/> property is true.
/// </summary>
public event EventHandler<DataGridAutoGeneratingColumnEventArgs> AutoGeneratingColumn;
/// <summary>
/// Occurs before a cell or row enters editing mode.
/// </summary>
public event EventHandler<DataGridBeginningEditEventArgs> BeginningEdit;
/// <summary>
/// Occurs after cell editing has ended.
/// </summary>
public event EventHandler<DataGridCellEditEndedEventArgs> CellEditEnded;
/// <summary>
/// Occurs immediately before cell editing has ended.
/// </summary>
public event EventHandler<DataGridCellEditEndingEventArgs> CellEditEnding;
/// <summary>
/// Occurs when the <see cref="Microsoft.Toolkit.Uwp.UI.Controls.DataGridColumn.DisplayIndex"/>
/// property of a column changes.
/// </summary>
public event EventHandler<DataGridColumnEventArgs> ColumnDisplayIndexChanged;
/// <summary>
/// Occurs when the user drops a column header that was being dragged using the mouse.
/// </summary>
public event EventHandler<DragCompletedEventArgs> ColumnHeaderDragCompleted;
/// <summary>
/// Occurs one or more times while the user drags a column header using the mouse.
/// </summary>
public event EventHandler<DragDeltaEventArgs> ColumnHeaderDragDelta;
/// <summary>
/// Occurs when the user begins dragging a column header using the mouse.
/// </summary>
public event EventHandler<DragStartedEventArgs> ColumnHeaderDragStarted;
/// <summary>
/// Raised when column reordering ends, to allow subscribers to clean up.
/// </summary>
public event EventHandler<DataGridColumnEventArgs> ColumnReordered;
/// <summary>
/// Raised when starting a column reordering action. Subscribers to this event can
/// set tooltip and caret UIElements, constrain tooltip position, indicate that
/// a preview should be shown, or cancel reordering.
/// </summary>
public event EventHandler<DataGridColumnReorderingEventArgs> ColumnReordering;
/// <summary>
/// This event is raised by OnCopyingRowClipboardContent method after the default row content is prepared.
/// Event listeners can modify or add to the row clipboard content.
/// </summary>
public event EventHandler<DataGridRowClipboardEventArgs> CopyingRowClipboardContent;
/// <summary>
/// Occurs when a different cell becomes the current cell.
/// </summary>
public event EventHandler<EventArgs> CurrentCellChanged;
/// <summary>
/// Occurs after a <see cref="DataGridRow"/>
/// is instantiated, so that you can customize it before it is used.
/// </summary>
public event EventHandler<DataGridRowEventArgs> LoadingRow;
/// <summary>
/// Occurs when a new row details template is applied to a row, so that you can customize
/// the details section before it is used.
/// </summary>
public event EventHandler<DataGridRowDetailsEventArgs> LoadingRowDetails;
/// <summary>
/// Occurs before a DataGridRowGroupHeader header is used.
/// </summary>
public event EventHandler<DataGridRowGroupHeaderEventArgs> LoadingRowGroup;
/// <summary>
/// Occurs when a cell in a <see cref="DataGridTemplateColumn"/> enters editing mode.
/// </summary>
public event EventHandler<DataGridPreparingCellForEditEventArgs> PreparingCellForEdit;
/// <summary>
/// Occurs when the <see cref="RowDetailsVisibilityMode"/>
/// property value changes.
/// </summary>
public event EventHandler<DataGridRowDetailsEventArgs> RowDetailsVisibilityChanged;
/// <summary>
/// Occurs when the row has been successfully committed or canceled.
/// </summary>
public event EventHandler<DataGridRowEditEndedEventArgs> RowEditEnded;
/// <summary>
/// Occurs immediately before the row has been successfully committed or canceled.
/// </summary>
public event EventHandler<DataGridRowEditEndingEventArgs> RowEditEnding;
/// <summary>
/// Occurs when the <see cref="SelectedItem"/> or
/// <see cref="SelectedItems"/> property value changes.
/// </summary>
public event SelectionChangedEventHandler SelectionChanged;
/// <summary>
/// Occurs when the <see cref="Microsoft.Toolkit.Uwp.UI.Controls.DataGridColumn"/> sorting request is triggered.
/// </summary>
public event EventHandler<DataGridColumnEventArgs> Sorting;
/// <summary>
/// Occurs when a <see cref="DataGridRow"/>
/// object becomes available for reuse.
/// </summary>
public event EventHandler<DataGridRowEventArgs> UnloadingRow;
/// <summary>
/// Occurs when the DataGridRowGroupHeader is available for reuse.
/// </summary>
public event EventHandler<DataGridRowGroupHeaderEventArgs> UnloadingRowGroup;
/// <summary>
/// Occurs when a row details element becomes available for reuse.
/// </summary>
public event EventHandler<DataGridRowDetailsEventArgs> UnloadingRowDetails;
/// <summary>
/// Initializes a new instance of the <see cref="DataGrid"/> class.
/// </summary>
public DataGrid()
{
this.TabNavigation = KeyboardNavigationMode.Once;
_loadedRows = new List<DataGridRow>();
_lostFocusActions = new Queue<Action>();
_selectedItems = new DataGridSelectedItemsCollection(this);
_rowGroupHeaderPropertyNameAlternative = Controls.Resources.DefaultRowGroupHeaderPropertyNameAlternative;
_rowGroupHeaderStyles = new ObservableCollection<Style>();
_rowGroupHeaderStyles.CollectionChanged += RowGroupHeaderStyles_CollectionChanged;
_rowGroupHeaderStylesOld = new List<Style>();
this.RowGroupHeadersTable = new IndexToValueTable<DataGridRowGroupInfo>();
_collapsedSlotsTable = new IndexToValueTable<Visibility>();
_validationItems = new Dictionary<INotifyDataErrorInfo, string>();
_validationResults = new List<ValidationResult>();
_bindingValidationResults = new List<ValidationResult>();
_propertyValidationResults = new List<ValidationResult>();
_indeiValidationResults = new List<ValidationResult>();
this.ColumnHeaderInteractionInfo = new DataGridColumnHeaderInteractionInfo();
this.DisplayData = new DataGridDisplayData(this);
this.ColumnsInternal = CreateColumnsInstance();
this.RowHeightEstimate = DATAGRID_defaultRowHeight;
this.RowDetailsHeightEstimate = 0;
_rowHeaderDesiredWidth = 0;
this.DataConnection = new DataGridDataConnection(this);
_showDetailsTable = new IndexToValueTable<Visibility>();
_focusInputDevice = FocusInputDeviceKind.None;
_proposedScrollBarsState = ScrollBarVisualState.NoIndicator;
_proposedScrollBarsSeparatorState = ScrollBarsSeparatorVisualState.SeparatorCollapsed;
this.AnchorSlot = -1;
_lastEstimatedRow = -1;
_editingColumnIndex = -1;
this.CurrentCellCoordinates = new DataGridCellCoordinates(-1, -1);
this.RowGroupHeaderHeightEstimate = DATAGRID_defaultRowHeight;
this.LastHandledKeyDown = VirtualKey.None;
this.DefaultStyleKey = typeof(DataGrid);
HookDataGridEvents();
}
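        // Illustrative usage sketch (not part of the control): creating and configuring
        // a DataGrid in code-behind. The 'people' collection is a hypothetical stand-in
        // for any IEnumerable data source.
        //
        //   var grid = new DataGrid
        //   {
        //       AutoGenerateColumns = true,
        //       CanUserSortColumns = true,
        //       IsReadOnly = true,
        //   };
        //   grid.ItemsSource = people;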
/// <summary>
/// Gets or sets the <see cref="T:System.Windows.Media.Brush"/> that is used to paint the background of odd-numbered rows.
/// </summary>
/// <returns>
/// The brush that is used to paint the background of odd-numbered rows.
/// </returns>
public Brush AlternatingRowBackground
{
get { return GetValue(AlternatingRowBackgroundProperty) as Brush; }
set { SetValue(AlternatingRowBackgroundProperty, value); }
}
/// <summary>
/// Identifies the <see cref="AlternatingRowBackground"/>
/// dependency property.
/// </summary>
/// <returns>
/// The identifier for the <see cref="AlternatingRowBackground"/>
/// dependency property.
/// </returns>
public static readonly DependencyProperty AlternatingRowBackgroundProperty =
DependencyProperty.Register(
"AlternatingRowBackground",
typeof(Brush),
typeof(DataGrid),
new PropertyMetadata(null, OnAlternatingRowBackgroundPropertyChanged));
private static void OnAlternatingRowBackgroundPropertyChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
{
DataGrid dataGrid = d as DataGrid;
foreach (DataGridRow row in dataGrid.GetAllRows())
{
row.EnsureBackground();
}
}
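        // Illustrative sketch: odd-numbered rows can be tinted by assigning a brush,
        // either in code-behind or in XAML. 'LightGray' is just an example value.
        //
        //   grid.AlternatingRowBackground = new SolidColorBrush(Colors.LightGray);
        //
        // or in XAML:
        //
        //   <controls:DataGrid AlternatingRowBackground="LightGray" />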
/// <summary>
/// Gets or sets the <see cref="T:System.Windows.Media.Brush"/> that is used to paint the foreground of odd-numbered rows.
/// </summary>
/// <returns>
/// The brush that is used to paint the foreground of odd-numbered rows.
/// </returns>
public Brush AlternatingRowForeground
{
get { return GetValue(AlternatingRowForegroundProperty) as Brush; }
set { SetValue(AlternatingRowForegroundProperty, value); }
}
/// <summary>
/// Identifies the <see cref="AlternatingRowForeground"/>
/// dependency property.
/// </summary>
/// <returns>
/// The identifier for the <see cref="AlternatingRowForeground"/>
/// dependency property.
/// </returns>
public static readonly DependencyProperty AlternatingRowForegroundProperty =
DependencyProperty.Register(
"AlternatingRowForeground",
typeof(Brush),
typeof(DataGrid),
new PropertyMetadata(null, OnAlternatingRowForegroundPropertyChanged));
private static void OnAlternatingRowForegroundPropertyChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
{
DataGrid dataGrid = d as DataGrid;
foreach (DataGridRow row in dataGrid.GetAllRows())
{
row.EnsureForeground();
}
}
/// <summary>
/// Gets or sets a value indicating whether the row details sections remain
/// fixed at the width of the display area or can scroll horizontally.
/// </summary>
public bool AreRowDetailsFrozen
{
get { return (bool)GetValue(AreRowDetailsFrozenProperty); }
set { SetValue(AreRowDetailsFrozenProperty, value); }
}
/// <summary>
/// Identifies the AreRowDetailsFrozen dependency property.
/// </summary>
public static readonly DependencyProperty AreRowDetailsFrozenProperty =
DependencyProperty.Register(
"AreRowDetailsFrozen",
typeof(bool),
typeof(DataGrid),
null);
/// <summary>
/// Gets or sets a value indicating whether the row group header sections
/// remain fixed at the width of the display area or can scroll horizontally.
/// </summary>
public bool AreRowGroupHeadersFrozen
{
get { return (bool)GetValue(AreRowGroupHeadersFrozenProperty); }
set { SetValue(AreRowGroupHeadersFrozenProperty, value); }
}
/// <summary>
/// Identifies the AreRowGroupHeadersFrozen dependency property.
/// </summary>
public static readonly DependencyProperty AreRowGroupHeadersFrozenProperty =
DependencyProperty.Register(
"AreRowGroupHeadersFrozen",
typeof(bool),
typeof(DataGrid),
new PropertyMetadata(true, OnAreRowGroupHeadersFrozenPropertyChanged));
private static void OnAreRowGroupHeadersFrozenPropertyChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
{
DataGrid dataGrid = d as DataGrid;
ProcessFrozenColumnCount(dataGrid);
// Update elements in the RowGroupHeader that were previously frozen.
if ((bool)e.NewValue)
{
if (dataGrid._rowsPresenter != null)
{
foreach (UIElement element in dataGrid._rowsPresenter.Children)
{
DataGridRowGroupHeader groupHeader = element as DataGridRowGroupHeader;
if (groupHeader != null)
{
groupHeader.ClearFrozenStates();
}
}
}
}
}
/// <summary>
/// Gets or sets a value indicating whether columns are created
/// automatically when the <see cref="ItemsSource"/> property is set.
/// </summary>
public bool AutoGenerateColumns
{
get { return (bool)GetValue(AutoGenerateColumnsProperty); }
set { SetValue(AutoGenerateColumnsProperty, value); }
}
/// <summary>
/// Identifies the AutoGenerateColumns dependency property.
/// </summary>
public static readonly DependencyProperty AutoGenerateColumnsProperty =
DependencyProperty.Register(
"AutoGenerateColumns",
typeof(bool),
typeof(DataGrid),
new PropertyMetadata(DATAGRID_defaultAutoGenerateColumns, OnAutoGenerateColumnsPropertyChanged));
private static void OnAutoGenerateColumnsPropertyChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
{
DataGrid dataGrid = d as DataGrid;
bool value = (bool)e.NewValue;
if (value)
{
dataGrid.InitializeElements(false /*recycleRows*/);
}
else
{
dataGrid.RemoveAutoGeneratedColumns();
}
}
/// <summary>
/// Gets or sets a value indicating whether the user can change
/// the column display order by dragging column headers with the mouse.
/// </summary>
public bool CanUserReorderColumns
{
get { return (bool)GetValue(CanUserReorderColumnsProperty); }
set { SetValue(CanUserReorderColumnsProperty, value); }
}
/// <summary>
/// Identifies the CanUserReorderColumns dependency property.
/// </summary>
public static readonly DependencyProperty CanUserReorderColumnsProperty =
DependencyProperty.Register(
"CanUserReorderColumns",
typeof(bool),
typeof(DataGrid),
new PropertyMetadata(DATAGRID_defaultCanUserReorderColumns));
/// <summary>
/// Gets or sets a value indicating whether the user can adjust column widths using the mouse.
/// </summary>
public bool CanUserResizeColumns
{
get { return (bool)GetValue(CanUserResizeColumnsProperty); }
set { SetValue(CanUserResizeColumnsProperty, value); }
}
/// <summary>
/// Identifies the CanUserResizeColumns dependency property.
/// </summary>
public static readonly DependencyProperty CanUserResizeColumnsProperty =
DependencyProperty.Register(
"CanUserResizeColumns",
typeof(bool),
typeof(DataGrid),
new PropertyMetadata(DATAGRID_defaultCanUserResizeColumns, OnCanUserResizeColumnsPropertyChanged));
/// <summary>
/// CanUserResizeColumns property changed handler.
/// </summary>
/// <param name="d">DataGrid that changed its CanUserResizeColumns.</param>
/// <param name="e">DependencyPropertyChangedEventArgs.</param>
private static void OnCanUserResizeColumnsPropertyChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
{
DataGrid dataGrid = d as DataGrid;
dataGrid.EnsureHorizontalLayout();
}
/// <summary>
/// Gets or sets a value indicating whether the user can sort columns by clicking the column header.
/// </summary>
public bool CanUserSortColumns
{
get { return (bool)GetValue(CanUserSortColumnsProperty); }
set { SetValue(CanUserSortColumnsProperty, value); }
}
/// <summary>
/// Identifies the CanUserSortColumns dependency property.
/// </summary>
public static readonly DependencyProperty CanUserSortColumnsProperty =
DependencyProperty.Register(
"CanUserSortColumns",
typeof(bool),
typeof(DataGrid),
new PropertyMetadata(DATAGRID_defaultCanUserSortColumns));
/// <summary>
/// Gets or sets the style that is used when rendering the data grid cells.
/// </summary>
public Style CellStyle
{
get { return GetValue(CellStyleProperty) as Style; }
set { SetValue(CellStyleProperty, value); }
}
/// <summary>
/// Identifies the <see cref="CellStyle"/> dependency property.
/// </summary>
public static readonly DependencyProperty CellStyleProperty =
DependencyProperty.Register(
"CellStyle",
typeof(Style),
typeof(DataGrid),
new PropertyMetadata(null, OnCellStylePropertyChanged));
private static void OnCellStylePropertyChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
{
DataGrid dataGrid = d as DataGrid;
if (dataGrid != null)
{
Style previousStyle = e.OldValue as Style;
foreach (DataGridRow row in dataGrid.GetAllRows())
{
foreach (DataGridCell cell in row.Cells)
{
cell.EnsureStyle(previousStyle);
}
row.FillerCell.EnsureStyle(previousStyle);
}
dataGrid.InvalidateRowHeightEstimate();
}
}
/// <summary>
/// Gets or sets a value that determines how DataGrid content is copied to the Clipboard.
/// </summary>
public DataGridClipboardCopyMode ClipboardCopyMode
{
get { return (DataGridClipboardCopyMode)GetValue(ClipboardCopyModeProperty); }
set { SetValue(ClipboardCopyModeProperty, value); }
}
/// <summary>
/// Identifies the <see cref="ClipboardCopyMode"/> dependency property.
/// </summary>
public static readonly DependencyProperty ClipboardCopyModeProperty =
DependencyProperty.Register(
"ClipboardCopyMode",
typeof(DataGridClipboardCopyMode),
typeof(DataGrid),
new PropertyMetadata(DataGridClipboardCopyMode.ExcludeHeader));
/// <summary>
/// Gets or sets the height of the column headers row.
/// </summary>
public double ColumnHeaderHeight
{
get { return (double)GetValue(ColumnHeaderHeightProperty); }
set { SetValue(ColumnHeaderHeightProperty, value); }
}
/// <summary>
/// Identifies the ColumnHeaderHeight dependency property.
/// </summary>
public static readonly DependencyProperty ColumnHeaderHeightProperty =
DependencyProperty.Register(
"ColumnHeaderHeight",
typeof(double),
typeof(DataGrid),
new PropertyMetadata(double.NaN, OnColumnHeaderHeightPropertyChanged));
/// <summary>
/// ColumnHeaderHeightProperty property changed handler.
/// </summary>
/// <param name="d">DataGrid that changed its ColumnHeaderHeight.</param>
/// <param name="e">DependencyPropertyChangedEventArgs.</param>
private static void OnColumnHeaderHeightPropertyChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
{
DataGrid dataGrid = d as DataGrid;
if (!dataGrid.IsHandlerSuspended(e.Property))
{
double value = (double)e.NewValue;
if (value < DATAGRID_minimumColumnHeaderHeight)
{
dataGrid.SetValueNoCallback(e.Property, e.OldValue);
throw DataGridError.DataGrid.ValueMustBeGreaterThanOrEqualTo("value", "ColumnHeaderHeight", DATAGRID_minimumColumnHeaderHeight);
}
if (value > DATAGRID_maxHeadersThickness)
{
dataGrid.SetValueNoCallback(e.Property, e.OldValue);
throw DataGridError.DataGrid.ValueMustBeLessThanOrEqualTo("value", "ColumnHeaderHeight", DATAGRID_maxHeadersThickness);
}
dataGrid.InvalidateMeasure();
}
}
/// <summary>
/// Gets or sets the style that is used when rendering the column headers.
/// </summary>
public Style ColumnHeaderStyle
{
get { return GetValue(ColumnHeaderStyleProperty) as Style; }
set { SetValue(ColumnHeaderStyleProperty, value); }
}
/// <summary>
/// Identifies the ColumnHeaderStyle dependency property.
/// </summary>
public static readonly DependencyProperty ColumnHeaderStyleProperty =
DependencyProperty.Register(
"ColumnHeaderStyle",
typeof(Style),
typeof(DataGrid),
new PropertyMetadata(null, OnColumnHeaderStylePropertyChanged));
private static void OnColumnHeaderStylePropertyChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
{
// TODO: ColumnHeaderStyle should be applied to the TopLeftCorner and the TopRightCorner as well
DataGrid dataGrid = d as DataGrid;
if (dataGrid != null)
{
Style previousStyle = e.OldValue as Style;
foreach (DataGridColumn column in dataGrid.Columns)
{
column.HeaderCell.EnsureStyle(previousStyle);
}
if (dataGrid.ColumnsInternal.FillerColumn != null)
{
dataGrid.ColumnsInternal.FillerColumn.HeaderCell.EnsureStyle(previousStyle);
}
}
}
/// <summary>
/// Gets or sets the standard width or automatic sizing mode of columns in the control.
/// </summary>
public DataGridLength ColumnWidth
{
get { return (DataGridLength)GetValue(ColumnWidthProperty); }
set { SetValue(ColumnWidthProperty, value); }
}
/// <summary>
/// Identifies the ColumnWidth dependency property.
/// </summary>
public static readonly DependencyProperty ColumnWidthProperty =
DependencyProperty.Register(
"ColumnWidth",
typeof(DataGridLength),
typeof(DataGrid),
new PropertyMetadata(DataGridLength.Auto, OnColumnWidthPropertyChanged));
/// <summary>
/// ColumnWidthProperty property changed handler.
/// </summary>
/// <param name="d">DataGrid that changed its ColumnWidth.</param>
/// <param name="e">DependencyPropertyChangedEventArgs.</param>
private static void OnColumnWidthPropertyChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
{
DataGrid dataGrid = d as DataGrid;
foreach (DataGridColumn column in dataGrid.ColumnsInternal.GetDisplayedColumns())
{
if (column.InheritsWidth)
{
column.SetWidthInternalNoCallback(dataGrid.ColumnWidth);
}
}
dataGrid.EnsureHorizontalLayout();
}
/// <summary>
/// Gets or sets the amount of data to fetch for virtualizing/prefetch operations.
/// </summary>
/// <returns>
/// The amount of data to fetch per interval, in pages.
/// </returns>
public double DataFetchSize
{
get { return (double)GetValue(DataFetchSizeProperty); }
set { SetValue(DataFetchSizeProperty, value); }
}
/// <summary>
/// Identifies the <see cref="DataFetchSize"/> dependency property.
/// </summary>
public static readonly DependencyProperty DataFetchSizeProperty =
DependencyProperty.Register(
nameof(DataFetchSize),
typeof(double),
typeof(DataGrid),
new PropertyMetadata(DATAGRID_defaultDataFetchSize, OnDataFetchSizePropertyChanged));
/// <summary>
/// DataFetchSizeProperty property changed handler.
/// </summary>
/// <param name="d">DataGrid that changed its DataFetchSize.</param>
/// <param name="e">DependencyPropertyChangedEventArgs.</param>
private static void OnDataFetchSizePropertyChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
{
DataGrid dataGrid = d as DataGrid;
if (!dataGrid.IsHandlerSuspended(e.Property))
{
double oldValue = (double)e.OldValue;
double newValue = (double)e.NewValue;
if (double.IsNaN(newValue))
{
dataGrid.SetValueNoCallback(e.Property, oldValue);
throw DataGridError.DataGrid.ValueCannotBeSetToNAN(nameof(dataGrid.DataFetchSize));
}
if (newValue < 0)
{
dataGrid.SetValueNoCallback(e.Property, oldValue);
throw DataGridError.DataGrid.ValueMustBeGreaterThanOrEqualTo("value", nameof(dataGrid.DataFetchSize), 0);
}
}
}
/// <summary>
/// Gets or sets the style that is used when rendering the drag indicator
/// that is displayed while dragging column headers.
/// </summary>
public Style DragIndicatorStyle
{
get { return GetValue(DragIndicatorStyleProperty) as Style; }
set { SetValue(DragIndicatorStyleProperty, value); }
}
/// <summary>
/// Identifies the <see cref="DragIndicatorStyle"/>
/// dependency property.
/// </summary>
public static readonly DependencyProperty DragIndicatorStyleProperty =
DependencyProperty.Register(
"DragIndicatorStyle",
typeof(Style),
typeof(DataGrid),
null);
/// <summary>
/// Gets or sets the style that is used when rendering the drop location indicator
/// that is displayed while dragging column headers.
/// </summary>
public Style DropLocationIndicatorStyle
{
get { return GetValue(DropLocationIndicatorStyleProperty) as Style; }
set { SetValue(DropLocationIndicatorStyleProperty, value); }
}
/// <summary>
/// Identifies the <see cref="DropLocationIndicatorStyle"/>
/// dependency property.
/// </summary>
public static readonly DependencyProperty DropLocationIndicatorStyleProperty =
DependencyProperty.Register(
"DropLocationIndicatorStyle",
typeof(Style),
typeof(DataGrid),
null);
/// <summary>
/// Gets or sets the number of columns that the user cannot scroll horizontally.
/// </summary>
public int FrozenColumnCount
{
get { return (int)GetValue(FrozenColumnCountProperty); }
set { SetValue(FrozenColumnCountProperty, value); }
}
/// <summary>
/// Identifies the <see cref="FrozenColumnCount"/>
/// dependency property.
/// </summary>
public static readonly DependencyProperty FrozenColumnCountProperty =
DependencyProperty.Register(
"FrozenColumnCount",
typeof(int),
typeof(DataGrid),
new PropertyMetadata(0, OnFrozenColumnCountPropertyChanged));
private static void OnFrozenColumnCountPropertyChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
{
DataGrid dataGrid = d as DataGrid;
if (!dataGrid.IsHandlerSuspended(e.Property))
{
if ((int)e.NewValue < 0)
{
dataGrid.SetValueNoCallback(DataGrid.FrozenColumnCountProperty, e.OldValue);
throw DataGridError.DataGrid.ValueMustBeGreaterThanOrEqualTo("value", "FrozenColumnCount", 0);
}
ProcessFrozenColumnCount(dataGrid);
}
}
private static void ProcessFrozenColumnCount(DataGrid dataGrid)
{
dataGrid.CorrectColumnFrozenStates();
dataGrid.ComputeScrollBarsLayout();
dataGrid.InvalidateColumnHeadersArrange();
dataGrid.InvalidateCellsArrange();
}
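        // Illustrative sketch: freezing the two leftmost columns so they remain visible
        // while the remaining columns scroll horizontally. The helper above re-runs the
        // frozen-state, scroll bar, and arrange passes whenever this value changes.
        //
        //   grid.FrozenColumnCount = 2;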
/// <summary>
/// Gets or sets a value indicating which grid lines separating inner cells are shown.
/// </summary>
public DataGridGridLinesVisibility GridLinesVisibility
{
get { return (DataGridGridLinesVisibility)GetValue(GridLinesVisibilityProperty); }
set { SetValue(GridLinesVisibilityProperty, value); }
}
/// <summary>
/// Identifies the GridLinesVisibility dependency property.
/// </summary>
public static readonly DependencyProperty GridLinesVisibilityProperty =
DependencyProperty.Register(
"GridLinesVisibility",
typeof(DataGridGridLinesVisibility),
typeof(DataGrid),
new PropertyMetadata(DATAGRID_defaultGridLinesVisibility, OnGridLinesVisibilityPropertyChanged));
/// <summary>
/// GridLinesVisibilityProperty property changed handler.
/// </summary>
/// <param name="d">DataGrid that changed its GridLinesVisibility.</param>
/// <param name="e">DependencyPropertyChangedEventArgs.</param>
private static void OnGridLinesVisibilityPropertyChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
{
DataGrid dataGrid = d as DataGrid;
foreach (DataGridRow row in dataGrid.GetAllRows())
{
row.EnsureGridLines();
row.InvalidateHorizontalArrange();
}
foreach (DataGridRowGroupHeader rowGroupHeader in dataGrid.GetAllRowGroupHeaders())
{
rowGroupHeader.EnsureGridLine();
}
}
/// <summary>
/// Gets or sets a value indicating the visibility of row and column headers.
/// </summary>
public DataGridHeadersVisibility HeadersVisibility
{
get { return (DataGridHeadersVisibility)GetValue(HeadersVisibilityProperty); }
set { SetValue(HeadersVisibilityProperty, value); }
}
/// <summary>
/// Identifies the HeadersVisibility dependency property.
/// </summary>
public static readonly DependencyProperty HeadersVisibilityProperty =
DependencyProperty.Register(
"HeadersVisibility",
typeof(DataGridHeadersVisibility),
typeof(DataGrid),
new PropertyMetadata(DATAGRID_defaultHeadersVisibility, OnHeadersVisibilityPropertyChanged));
/// <summary>
/// HeadersVisibilityProperty property changed handler.
/// </summary>
/// <param name="d">DataGrid that changed its HeadersVisibility.</param>
/// <param name="e">DependencyPropertyChangedEventArgs.</param>
private static void OnHeadersVisibilityPropertyChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
{
DataGrid dataGrid = d as DataGrid;
DataGridHeadersVisibility newValue = (DataGridHeadersVisibility)e.NewValue;
DataGridHeadersVisibility oldValue = (DataGridHeadersVisibility)e.OldValue;
Func<DataGridHeadersVisibility, DataGridHeadersVisibility, bool> hasFlags = (DataGridHeadersVisibility value, DataGridHeadersVisibility flags) => ((value & flags) == flags);
bool newValueCols = hasFlags(newValue, DataGridHeadersVisibility.Column);
bool newValueRows = hasFlags(newValue, DataGridHeadersVisibility.Row);
bool oldValueCols = hasFlags(oldValue, DataGridHeadersVisibility.Column);
bool oldValueRows = hasFlags(oldValue, DataGridHeadersVisibility.Row);
// Columns
if (newValueCols != oldValueCols)
{
if (dataGrid._columnHeadersPresenter != null)
{
dataGrid.EnsureColumnHeadersVisibility();
if (!newValueCols)
{
dataGrid._columnHeadersPresenter.Measure(new Size(0.0, 0.0));
}
else
{
dataGrid.EnsureVerticalGridLines();
}
dataGrid.InvalidateMeasure();
}
}
// Rows
if (newValueRows != oldValueRows && dataGrid._rowsPresenter != null)
{
foreach (FrameworkElement element in dataGrid._rowsPresenter.Children)
{
DataGridRow row = element as DataGridRow;
if (row != null)
{
row.EnsureHeaderStyleAndVisibility(null);
if (newValueRows)
{
row.ApplyState(false /*animate*/);
row.EnsureHeaderVisibility();
}
}
else
{
DataGridRowGroupHeader rowGroupHeader = element as DataGridRowGroupHeader;
if (rowGroupHeader != null)
{
rowGroupHeader.EnsureHeaderStyleAndVisibility(null);
}
}
}
dataGrid.InvalidateRowHeightEstimate();
dataGrid.InvalidateRowsMeasure(true /*invalidateIndividualElements*/);
}
// TODO: This isn't necessary if the TopLeftCorner and the TopRightCorner Autosize to 0.
// See if their templates can be changed to do that.
if (dataGrid._topLeftCornerHeader != null)
{
dataGrid._topLeftCornerHeader.Visibility = (newValueRows && newValueCols) ? Visibility.Visible : Visibility.Collapsed;
if (dataGrid._topLeftCornerHeader.Visibility == Visibility.Collapsed)
{
dataGrid._topLeftCornerHeader.Measure(new Size(0.0, 0.0));
}
}
}
/// <summary>
/// Gets or sets the <see cref="T:System.Windows.Media.Brush"/> that is used to paint grid lines separating rows.
/// </summary>
public Brush HorizontalGridLinesBrush
{
get { return GetValue(HorizontalGridLinesBrushProperty) as Brush; }
set { SetValue(HorizontalGridLinesBrushProperty, value); }
}
/// <summary>
/// Identifies the HorizontalGridLinesBrush dependency property.
/// </summary>
public static readonly DependencyProperty HorizontalGridLinesBrushProperty =
DependencyProperty.Register(
"HorizontalGridLinesBrush",
typeof(Brush),
typeof(DataGrid),
new PropertyMetadata(null, OnHorizontalGridLinesBrushPropertyChanged));
/// <summary>
/// HorizontalGridLinesBrushProperty property changed handler.
/// </summary>
/// <param name="d">DataGrid that changed its HorizontalGridLinesBrush.</param>
/// <param name="e">DependencyPropertyChangedEventArgs.</param>
private static void OnHorizontalGridLinesBrushPropertyChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
{
DataGrid dataGrid = d as DataGrid;
if (!dataGrid.IsHandlerSuspended(e.Property) && dataGrid._rowsPresenter != null)
{
foreach (DataGridRow row in dataGrid.GetAllRows())
{
row.EnsureGridLines();
}
foreach (DataGridRowGroupHeader rowGroupHeader in dataGrid.GetAllRowGroupHeaders())
{
rowGroupHeader.EnsureGridLine();
}
}
}
/// <summary>
/// Gets or sets a value indicating how the horizontal scroll bar is displayed.
/// </summary>
public ScrollBarVisibility HorizontalScrollBarVisibility
{
get { return (ScrollBarVisibility)GetValue(HorizontalScrollBarVisibilityProperty); }
set { SetValue(HorizontalScrollBarVisibilityProperty, value); }
}
/// <summary>
/// Identifies the HorizontalScrollBarVisibility dependency property.
/// </summary>
public static readonly DependencyProperty HorizontalScrollBarVisibilityProperty =
DependencyProperty.Register(
"HorizontalScrollBarVisibility",
typeof(ScrollBarVisibility),
typeof(DataGrid),
new PropertyMetadata(DATAGRID_defaultScrollBarVisibility, OnHorizontalScrollBarVisibilityPropertyChanged));
/// <summary>
/// HorizontalScrollBarVisibilityProperty property changed handler.
/// </summary>
/// <param name="d">DataGrid that changed its HorizontalScrollBarVisibility.</param>
/// <param name="e">DependencyPropertyChangedEventArgs.</param>
private static void OnHorizontalScrollBarVisibilityPropertyChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
{
DataGrid dataGrid = d as DataGrid;
if (!dataGrid.IsHandlerSuspended(e.Property) && (ScrollBarVisibility)e.NewValue != (ScrollBarVisibility)e.OldValue)
{
dataGrid.UpdateRowsPresenterManipulationMode(true /*horizontalMode*/, false /*verticalMode*/);
if (dataGrid._hScrollBar != null)
{
if (dataGrid.IsHorizontalScrollBarOverCells)
{
dataGrid.ComputeScrollBarsLayout();
}
else
{
dataGrid.InvalidateMeasure();
}
}
}
}
/// <summary>
/// Gets or sets a value indicating whether the user can edit the values in the control.
/// </summary>
public bool IsReadOnly
{
get { return (bool)GetValue(IsReadOnlyProperty); }
set { SetValue(IsReadOnlyProperty, value); }
}
/// <summary>
/// Identifies the IsReadOnly dependency property.
/// </summary>
public static readonly DependencyProperty IsReadOnlyProperty =
DependencyProperty.Register(
"IsReadOnly",
typeof(bool),
typeof(DataGrid),
new PropertyMetadata(false, OnIsReadOnlyPropertyChanged));
/// <summary>
/// IsReadOnlyProperty property changed handler.
/// </summary>
/// <param name="d">DataGrid that changed its IsReadOnly.</param>
/// <param name="e">DependencyPropertyChangedEventArgs.</param>
private static void OnIsReadOnlyPropertyChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
{
DataGrid dataGrid = d as DataGrid;
if (!dataGrid.IsHandlerSuspended(e.Property))
{
bool value = (bool)e.NewValue;
if (value && !dataGrid.CommitEdit(DataGridEditingUnit.Row, true /*exitEditing*/))
{
dataGrid.CancelEdit(DataGridEditingUnit.Row, false /*raiseEvents*/);
}
#if FEATURE_IEDITABLECOLLECTIONVIEW
dataGrid.UpdateNewItemPlaceholder();
#endif
}
}
/// <summary>
/// Gets a value indicating whether data in the grid is valid.
/// </summary>
public bool IsValid
{
get
{
return (bool)GetValue(IsValidProperty);
}
internal set
{
if (value != this.IsValid)
{
if (value)
{
VisualStates.GoToState(this, true, VisualStates.StateValid);
}
else
{
VisualStates.GoToState(this, true, VisualStates.StateInvalid, VisualStates.StateValid);
}
this.SetValueNoCallback(DataGrid.IsValidProperty, value);
}
}
}
/// <summary>
/// Identifies the IsValid dependency property.
/// </summary>
public static readonly DependencyProperty IsValidProperty =
DependencyProperty.Register(
"IsValid",
typeof(bool),
typeof(DataGrid),
new PropertyMetadata(true, OnIsValidPropertyChanged));
/// <summary>
/// IsValidProperty property changed handler.
/// </summary>
/// <param name="d">DataGrid that changed its IsValid.</param>
/// <param name="e">DependencyPropertyChangedEventArgs.</param>
private static void OnIsValidPropertyChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
{
DataGrid dataGrid = d as DataGrid;
if (!dataGrid.IsHandlerSuspended(e.Property))
{
dataGrid.SetValueNoCallback(DataGrid.IsValidProperty, e.OldValue);
throw DataGridError.DataGrid.UnderlyingPropertyIsReadOnly("IsValid");
}
}
/// <summary>
/// Gets or sets the threshold range that governs when the DataGrid class will begin to prefetch more items.
/// </summary>
/// <returns>
/// The loading threshold, in terms of pages.
/// </returns>
public double IncrementalLoadingThreshold
{
get { return (double)GetValue(IncrementalLoadingThresholdProperty); }
set { SetValue(IncrementalLoadingThresholdProperty, value); }
}
/// <summary>
/// Identifies the <see cref="IncrementalLoadingThreshold"/> dependency property.
/// </summary>
public static readonly DependencyProperty IncrementalLoadingThresholdProperty =
DependencyProperty.Register(
nameof(IncrementalLoadingThreshold),
typeof(double),
typeof(DataGrid),
new PropertyMetadata(DATAGRID_defaultIncrementalLoadingThreshold, OnIncrementalLoadingThresholdPropertyChanged));
/// <summary>
/// IncrementalLoadingThresholdProperty property changed handler.
/// </summary>
/// <param name="d">DataGrid that changed its IncrementalLoadingThreshold.</param>
/// <param name="e">DependencyPropertyChangedEventArgs.</param>
private static void OnIncrementalLoadingThresholdPropertyChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
{
DataGrid dataGrid = d as DataGrid;
if (!dataGrid.IsHandlerSuspended(e.Property))
{
double oldValue = (double)e.OldValue;
double newValue = (double)e.NewValue;
if (double.IsNaN(newValue))
{
dataGrid.SetValueNoCallback(e.Property, oldValue);
throw DataGridError.DataGrid.ValueCannotBeSetToNAN(nameof(dataGrid.IncrementalLoadingThreshold));
}
if (newValue < 0)
{
dataGrid.SetValueNoCallback(e.Property, oldValue);
throw DataGridError.DataGrid.ValueMustBeGreaterThanOrEqualTo("value", nameof(dataGrid.IncrementalLoadingThreshold), 0);
}
if (newValue > oldValue)
{
dataGrid.LoadMoreDataFromIncrementalItemsSource();
}
}
}
/// <summary>
/// Gets or sets a value that indicates the conditions for prefetch operations by the DataGrid class.
/// </summary>
/// <returns>
/// An enumeration value that indicates the conditions that trigger prefetch operations. The default is **Edge**.
/// </returns>
public IncrementalLoadingTrigger IncrementalLoadingTrigger
{
get { return (IncrementalLoadingTrigger)GetValue(IncrementalLoadingTriggerProperty); }
set { SetValue(IncrementalLoadingTriggerProperty, value); }
}
/// <summary>
/// Identifies the <see cref="IncrementalLoadingTrigger"/> dependency property.
/// </summary>
public static readonly DependencyProperty IncrementalLoadingTriggerProperty =
DependencyProperty.Register(
nameof(IncrementalLoadingTrigger),
typeof(IncrementalLoadingTrigger),
typeof(DataGrid),
new PropertyMetadata(IncrementalLoadingTrigger.Edge));
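        // Illustrative sketch: a typical incremental-loading configuration. With a data
        // source that supports incremental loading, the grid prefetches DataFetchSize
        // pages per request once the user scrolls within IncrementalLoadingThreshold
        // pages of the loaded edge. The values below are examples only.
        //
        //   grid.IncrementalLoadingTrigger = IncrementalLoadingTrigger.Edge;
        //   grid.IncrementalLoadingThreshold = 2;   // start loading 2 pages from the edge
        //   grid.DataFetchSize = 3;                 // fetch 3 pages per request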
/// <summary>
/// Gets or sets a collection that is used to generate the content of the control.
/// </summary>
public IEnumerable ItemsSource
{
get { return GetValue(ItemsSourceProperty) as IEnumerable; }
set { SetValue(ItemsSourceProperty, value); }
}
/// <summary>
/// Identifies the <see cref="ItemsSource"/> dependency property.
/// </summary>
public static readonly DependencyProperty ItemsSourceProperty =
DependencyProperty.Register(
"ItemsSource",
typeof(IEnumerable),
typeof(DataGrid),
new PropertyMetadata(null, OnItemsSourcePropertyChanged));
/// <summary>
/// ItemsSourceProperty property changed handler.
/// </summary>
/// <param name="d">DataGrid that changed its ItemsSource.</param>
/// <param name="e">DependencyPropertyChangedEventArgs.</param>
private static void OnItemsSourcePropertyChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
{
DataGrid dataGrid = d as DataGrid;
if (!dataGrid.IsHandlerSuspended(e.Property))
{
DiagnosticsDebug.Assert(dataGrid.DataConnection != null, "Expected non-null DataConnection.");
if (dataGrid.LoadingOrUnloadingRow)
{
dataGrid.SetValueNoCallback(DataGrid.ItemsSourceProperty, e.OldValue);
throw DataGridError.DataGrid.CannotChangeItemsWhenLoadingRows();
}
// Try to commit edit on the old DataSource, but force a cancel if it fails.
if (!dataGrid.CommitEdit())
{
dataGrid.CancelEdit(DataGridEditingUnit.Row, false);
}
dataGrid.DataConnection.UnWireEvents(dataGrid.DataConnection.DataSource);
dataGrid.DataConnection.ClearDataProperties();
dataGrid.ClearRowGroupHeadersTable();
// The old selected indexes are no longer relevant. There's a perf benefit from
// updating the selected indexes with a null DataSource, because we know that all
// of the previously selected indexes have been removed from selection.
dataGrid.DataConnection.DataSource = null;
dataGrid._selectedItems.UpdateIndexes();
dataGrid.CoerceSelectedItem();
// Wrap an IEnumerable in an ICollectionView if it's not already one.
bool setDefaultSelection = false;
IEnumerable newItemsSource = e.NewValue as IEnumerable;
if (newItemsSource != null && !(newItemsSource is ICollectionView))
{
dataGrid.DataConnection.DataSource = DataGridDataConnection.CreateView(newItemsSource);
}
else
{
dataGrid.DataConnection.DataSource = newItemsSource;
setDefaultSelection = true;
}
if (dataGrid.DataConnection.DataSource != null)
{
// Setup the column headers.
if (dataGrid.DataConnection.DataType != null)
{
foreach (DataGridBoundColumn boundColumn in dataGrid.ColumnsInternal.GetDisplayedColumns(column => column is DataGridBoundColumn))
{
boundColumn.SetHeaderFromBinding();
}
}
dataGrid.DataConnection.WireEvents(dataGrid.DataConnection.DataSource);
}
// Wait for the current cell to be set before we raise any SelectionChanged events.
dataGrid._makeFirstDisplayedCellCurrentCellPending = true;
// Clear out the old rows and remove the generated columns.
dataGrid.ClearRows(false /*recycle*/);
dataGrid.RemoveAutoGeneratedColumns();
// Set the SlotCount (from the data count and number of row group headers) before we make the default selection.
dataGrid.PopulateRowGroupHeadersTable();
dataGrid.RefreshSlotCounts();
dataGrid.SelectedItem = null;
if (dataGrid.DataConnection.CollectionView != null && setDefaultSelection)
{
dataGrid.SelectedItem = dataGrid.DataConnection.CollectionView.CurrentItem;
}
// Treat this like the DataGrid has never been measured because all calculations at
// this point are invalid until the next layout cycle. For instance, the ItemsSource
// can be set when the DataGrid is not part of the visual tree.
dataGrid._measured = false;
dataGrid.InvalidateMeasure();
}
}
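// Illustrative usage sketch (assumes a hypothetical Person type and a dataGrid instance):
// assigning a source that is not already an ICollectionView causes the handler above to
// wrap it via DataGridDataConnection.CreateView before wiring collection change events.
//
//   var people = new ObservableCollection<Person>();
//   dataGrid.ItemsSource = people;    // wrapped in an ICollectionView; events are wired
//   people.Add(new Person());         // change notification flows through to the grid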
/// <summary>
/// Gets or sets the maximum width of columns in the <see cref="DataGrid"/>.
/// </summary>
public double MaxColumnWidth
{
get { return (double)GetValue(MaxColumnWidthProperty); }
set { SetValue(MaxColumnWidthProperty, value); }
}
/// <summary>
/// Identifies the MaxColumnWidth dependency property.
/// </summary>
public static readonly DependencyProperty MaxColumnWidthProperty =
DependencyProperty.Register(
"MaxColumnWidth",
typeof(double),
typeof(DataGrid),
new PropertyMetadata(DATAGRID_defaultMaxColumnWidth, OnMaxColumnWidthPropertyChanged));
/// <summary>
/// MaxColumnWidthProperty property changed handler.
/// </summary>
/// <param name="d">DataGrid that changed its MaxColumnWidth.</param>
/// <param name="e">DependencyPropertyChangedEventArgs.</param>
private static void OnMaxColumnWidthPropertyChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
{
DataGrid dataGrid = d as DataGrid;
if (!dataGrid.IsHandlerSuspended(e.Property))
{
double oldValue = (double)e.OldValue;
double newValue = (double)e.NewValue;
if (double.IsNaN(newValue))
{
dataGrid.SetValueNoCallback(e.Property, e.OldValue);
throw DataGridError.DataGrid.ValueCannotBeSetToNAN("MaxColumnWidth");
}
if (newValue < 0)
{
dataGrid.SetValueNoCallback(e.Property, e.OldValue);
throw DataGridError.DataGrid.ValueMustBeGreaterThanOrEqualTo("value", "MaxColumnWidth", 0);
}
if (dataGrid.MinColumnWidth > newValue)
{
dataGrid.SetValueNoCallback(e.Property, e.OldValue);
throw DataGridError.DataGrid.ValueMustBeGreaterThanOrEqualTo("value", "MaxColumnWidth", "MinColumnWidth");
}
foreach (DataGridColumn column in dataGrid.ColumnsInternal.GetDisplayedColumns())
{
dataGrid.OnColumnMaxWidthChanged(column, Math.Min(column.MaxWidth, oldValue));
}
}
}
/// <summary>
/// Gets or sets the minimum width of columns in the <see cref="DataGrid"/>.
/// </summary>
public double MinColumnWidth
{
get { return (double)GetValue(MinColumnWidthProperty); }
set { SetValue(MinColumnWidthProperty, value); }
}
/// <summary>
/// Identifies the MinColumnWidth dependency property.
/// </summary>
public static readonly DependencyProperty MinColumnWidthProperty =
DependencyProperty.Register(
"MinColumnWidth",
typeof(double),
typeof(DataGrid),
new PropertyMetadata(DATAGRID_defaultMinColumnWidth, OnMinColumnWidthPropertyChanged));
/// <summary>
/// MinColumnWidthProperty property changed handler.
/// </summary>
/// <param name="d">DataGrid that changed its MinColumnWidth.</param>
/// <param name="e">DependencyPropertyChangedEventArgs.</param>
private static void OnMinColumnWidthPropertyChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
{
DataGrid dataGrid = d as DataGrid;
if (!dataGrid.IsHandlerSuspended(e.Property))
{
double oldValue = (double)e.OldValue;
double newValue = (double)e.NewValue;
if (double.IsNaN(newValue))
{
dataGrid.SetValueNoCallback(e.Property, e.OldValue);
throw DataGridError.DataGrid.ValueCannotBeSetToNAN("MinColumnWidth");
}
if (newValue < 0)
{
dataGrid.SetValueNoCallback(e.Property, e.OldValue);
throw DataGridError.DataGrid.ValueMustBeGreaterThanOrEqualTo("value", "MinColumnWidth", 0);
}
if (double.IsPositiveInfinity(newValue))
{
dataGrid.SetValueNoCallback(e.Property, e.OldValue);
throw DataGridError.DataGrid.ValueCannotBeSetToInfinity("MinColumnWidth");
}
if (dataGrid.MaxColumnWidth < newValue)
{
dataGrid.SetValueNoCallback(e.Property, e.OldValue);
throw DataGridError.DataGrid.ValueMustBeLessThanOrEqualTo("value", "MinColumnWidth", "MaxColumnWidth");
}
foreach (DataGridColumn column in dataGrid.ColumnsInternal.GetDisplayedColumns())
{
dataGrid.OnColumnMinWidthChanged(column, Math.Max(column.MinWidth, oldValue));
}
}
}
/// <summary>
/// Gets or sets the <see cref="T:System.Windows.Media.Brush"/> that is used to paint row backgrounds.
/// </summary>
public Brush RowBackground
{
get { return GetValue(RowBackgroundProperty) as Brush; }
set { SetValue(RowBackgroundProperty, value); }
}
/// <summary>
/// Identifies the <see cref="RowBackground"/> dependency property.
/// </summary>
public static readonly DependencyProperty RowBackgroundProperty =
DependencyProperty.Register(
"RowBackground",
typeof(Brush),
typeof(DataGrid),
new PropertyMetadata(null, OnRowBackgroundPropertyChanged));
private static void OnRowBackgroundPropertyChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
{
DataGrid dataGrid = d as DataGrid;
// Go through the Displayed rows and update the background
foreach (DataGridRow row in dataGrid.GetAllRows())
{
row.EnsureBackground();
}
}
/// <summary>
/// Gets or sets the template that is used to display the content of the details section of rows.
/// </summary>
public DataTemplate RowDetailsTemplate
{
get { return GetValue(RowDetailsTemplateProperty) as DataTemplate; }
set { SetValue(RowDetailsTemplateProperty, value); }
}
/// <summary>
/// Identifies the RowDetailsTemplate dependency property.
/// </summary>
public static readonly DependencyProperty RowDetailsTemplateProperty =
DependencyProperty.Register(
"RowDetailsTemplate",
typeof(DataTemplate),
typeof(DataGrid),
new PropertyMetadata(null, OnRowDetailsTemplatePropertyChanged));
private static void OnRowDetailsTemplatePropertyChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
{
DataGrid dataGrid = d as DataGrid;
// Update the RowDetails templates if necessary
if (dataGrid._rowsPresenter != null)
{
foreach (DataGridRow row in dataGrid.GetAllRows())
{
if (dataGrid.GetRowDetailsVisibility(row.Index) == Visibility.Visible)
{
// DetailsPreferredHeight is initialized when the DetailsElement's size changes.
row.ApplyDetailsTemplate(false /*initializeDetailsPreferredHeight*/);
}
}
}
dataGrid.UpdateRowDetailsHeightEstimate();
dataGrid.InvalidateMeasure();
}
/// <summary>
/// Gets or sets a value indicating when the details sections of rows are displayed.
/// </summary>
public DataGridRowDetailsVisibilityMode RowDetailsVisibilityMode
{
get { return (DataGridRowDetailsVisibilityMode)GetValue(RowDetailsVisibilityModeProperty); }
set { SetValue(RowDetailsVisibilityModeProperty, value); }
}
/// <summary>
/// Identifies the RowDetailsVisibilityMode dependency property.
/// </summary>
public static readonly DependencyProperty RowDetailsVisibilityModeProperty =
DependencyProperty.Register(
"RowDetailsVisibilityMode",
typeof(DataGridRowDetailsVisibilityMode),
typeof(DataGrid),
new PropertyMetadata(DATAGRID_defaultRowDetailsVisibility, OnRowDetailsVisibilityModePropertyChanged));
/// <summary>
/// RowDetailsVisibilityModeProperty property changed handler.
/// </summary>
/// <param name="d">DataGrid that changed its RowDetailsVisibilityMode.</param>
/// <param name="e">DependencyPropertyChangedEventArgs.</param>
private static void OnRowDetailsVisibilityModePropertyChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
{
DataGrid dataGrid = d as DataGrid;
dataGrid.UpdateRowDetailsVisibilityMode((DataGridRowDetailsVisibilityMode)e.NewValue);
}
/// <summary>
/// Gets or sets the <see cref="T:System.Windows.Media.Brush"/> that is used as the default foreground for cells.
/// </summary>
public Brush RowForeground
{
get { return GetValue(RowForegroundProperty) as Brush; }
set { SetValue(RowForegroundProperty, value); }
}
/// <summary>
/// Identifies the <see cref="RowForeground"/> dependency property.
/// </summary>
public static readonly DependencyProperty RowForegroundProperty =
DependencyProperty.Register(
"RowForeground",
typeof(Brush),
typeof(DataGrid),
new PropertyMetadata(null, OnRowForegroundPropertyChanged));
private static void OnRowForegroundPropertyChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
{
DataGrid dataGrid = d as DataGrid;
// Go through the Displayed rows and update the foreground
foreach (DataGridRow row in dataGrid.GetAllRows())
{
row.EnsureForeground();
}
}
/// <summary>
/// Gets or sets the standard height of rows in the control.
/// </summary>
public double RowHeight
{
get { return (double)GetValue(RowHeightProperty); }
set { SetValue(RowHeightProperty, value); }
}
/// <summary>
/// Identifies the RowHeight dependency property.
/// </summary>
public static readonly DependencyProperty RowHeightProperty =
DependencyProperty.Register(
"RowHeight",
typeof(double),
typeof(DataGrid),
new PropertyMetadata(double.NaN, OnRowHeightPropertyChanged));
/// <summary>
/// RowHeightProperty property changed handler.
/// </summary>
/// <param name="d">DataGrid that changed its RowHeight.</param>
/// <param name="e">DependencyPropertyChangedEventArgs.</param>
private static void OnRowHeightPropertyChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
{
DataGrid dataGrid = d as DataGrid;
if (!dataGrid.IsHandlerSuspended(e.Property))
{
double value = (double)e.NewValue;
if (value < DataGridRow.DATAGRIDROW_minimumHeight)
{
dataGrid.SetValueNoCallback(e.Property, e.OldValue);
throw DataGridError.DataGrid.ValueMustBeGreaterThanOrEqualTo("value", "RowHeight", 0);
}
if (value > DataGridRow.DATAGRIDROW_maximumHeight)
{
dataGrid.SetValueNoCallback(e.Property, e.OldValue);
throw DataGridError.DataGrid.ValueMustBeLessThanOrEqualTo("value", "RowHeight", DataGridRow.DATAGRIDROW_maximumHeight);
}
dataGrid.InvalidateRowHeightEstimate();
// Re-measure all the rows due to the Height change
dataGrid.InvalidateRowsMeasure(true);
// DataGrid needs to update the layout information and the ScrollBars
dataGrid.InvalidateMeasure();
}
}
/// <summary>
/// Gets or sets the width of the row header column.
/// </summary>
public double RowHeaderWidth
{
get { return (double)GetValue(RowHeaderWidthProperty); }
set { SetValue(RowHeaderWidthProperty, value); }
}
/// <summary>
/// Identifies the RowHeaderWidth dependency property.
/// </summary>
public static readonly DependencyProperty RowHeaderWidthProperty =
DependencyProperty.Register(
"RowHeaderWidth",
typeof(double),
typeof(DataGrid),
new PropertyMetadata(double.NaN, OnRowHeaderWidthPropertyChanged));
/// <summary>
/// RowHeaderWidthProperty property changed handler.
/// </summary>
/// <param name="d">DataGrid that changed its RowHeaderWidth.</param>
/// <param name="e">DependencyPropertyChangedEventArgs.</param>
private static void OnRowHeaderWidthPropertyChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
{
DataGrid dataGrid = d as DataGrid;
if (!dataGrid.IsHandlerSuspended(e.Property))
{
double value = (double)e.NewValue;
if (value < DATAGRID_minimumRowHeaderWidth)
{
dataGrid.SetValueNoCallback(e.Property, e.OldValue);
throw DataGridError.DataGrid.ValueMustBeGreaterThanOrEqualTo("value", "RowHeaderWidth", DATAGRID_minimumRowHeaderWidth);
}
if (value > DATAGRID_maxHeadersThickness)
{
dataGrid.SetValueNoCallback(e.Property, e.OldValue);
throw DataGridError.DataGrid.ValueMustBeLessThanOrEqualTo("value", "RowHeaderWidth", DATAGRID_maxHeadersThickness);
}
dataGrid.EnsureRowHeaderWidth();
}
}
/// <summary>
/// Gets or sets the style that is used when rendering the row headers.
/// </summary>
public Style RowHeaderStyle
{
get { return GetValue(RowHeaderStyleProperty) as Style; }
set { SetValue(RowHeaderStyleProperty, value); }
}
/// <summary>
/// Identifies the <see cref="RowHeaderStyle"/> dependency property.
/// </summary>
public static readonly DependencyProperty RowHeaderStyleProperty =
DependencyProperty.Register(
"RowHeaderStyle",
typeof(Style),
typeof(DataGrid),
new PropertyMetadata(null, OnRowHeaderStylePropertyChanged));
private static void OnRowHeaderStylePropertyChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
{
DataGrid dataGrid = d as DataGrid;
if (dataGrid != null && dataGrid._rowsPresenter != null)
{
// Set HeaderStyle for displayed rows
Style previousStyle = e.OldValue as Style;
foreach (UIElement element in dataGrid._rowsPresenter.Children)
{
DataGridRow row = element as DataGridRow;
if (row != null)
{
row.EnsureHeaderStyleAndVisibility(previousStyle);
}
else
{
DataGridRowGroupHeader groupHeader = element as DataGridRowGroupHeader;
if (groupHeader != null)
{
groupHeader.EnsureHeaderStyleAndVisibility(previousStyle);
}
}
}
dataGrid.InvalidateRowHeightEstimate();
}
}
/// <summary>
/// Gets or sets the style that is used when rendering the rows.
/// </summary>
public Style RowStyle
{
get { return GetValue(RowStyleProperty) as Style; }
set { SetValue(RowStyleProperty, value); }
}
/// <summary>
/// Identifies the <see cref="RowStyle"/> dependency property.
/// </summary>
public static readonly DependencyProperty RowStyleProperty =
DependencyProperty.Register(
"RowStyle",
typeof(Style),
typeof(DataGrid),
new PropertyMetadata(null, OnRowStylePropertyChanged));
private static void OnRowStylePropertyChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
{
DataGrid dataGrid = d as DataGrid;
if (dataGrid != null)
{
if (dataGrid._rowsPresenter != null)
{
// Set the style for displayed rows if it has not already been set
foreach (DataGridRow row in dataGrid.GetAllRows())
{
EnsureElementStyle(row, e.OldValue as Style, e.NewValue as Style);
}
}
dataGrid.InvalidateRowHeightEstimate();
}
}
/// <summary>
/// Gets or sets the selection behavior of the data grid.
/// </summary>
public DataGridSelectionMode SelectionMode
{
get { return (DataGridSelectionMode)GetValue(SelectionModeProperty); }
set { SetValue(SelectionModeProperty, value); }
}
/// <summary>
/// Identifies the SelectionMode dependency property.
/// </summary>
public static readonly DependencyProperty SelectionModeProperty =
DependencyProperty.Register(
"SelectionMode",
typeof(DataGridSelectionMode),
typeof(DataGrid),
new PropertyMetadata(DATAGRID_defaultSelectionMode, OnSelectionModePropertyChanged));
/// <summary>
/// SelectionModeProperty property changed handler.
/// </summary>
/// <param name="d">DataGrid that changed its SelectionMode.</param>
/// <param name="e">DependencyPropertyChangedEventArgs.</param>
private static void OnSelectionModePropertyChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
{
DataGrid dataGrid = d as DataGrid;
if (!dataGrid.IsHandlerSuspended(e.Property))
{
dataGrid.ClearRowSelection(true /*resetAnchorSlot*/);
}
}
/// <summary>
/// Gets or sets the index of the current selection.
/// </summary>
/// <returns>The index of the current selection, or -1 if the selection is empty.</returns>
public int SelectedIndex
{
get { return (int)GetValue(SelectedIndexProperty); }
set { SetValue(SelectedIndexProperty, value); }
}
/// <summary>
/// Identifies the SelectedIndex dependency property.
/// </summary>
public static readonly DependencyProperty SelectedIndexProperty =
DependencyProperty.Register(
"SelectedIndex",
typeof(int),
typeof(DataGrid),
new PropertyMetadata(-1, OnSelectedIndexPropertyChanged));
/// <summary>
/// SelectedIndexProperty property changed handler.
/// </summary>
/// <param name="d">DataGrid that changed its SelectedIndex.</param>
/// <param name="e">DependencyPropertyChangedEventArgs.</param>
private static void OnSelectedIndexPropertyChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
{
DataGrid dataGrid = d as DataGrid;
if (!dataGrid.IsHandlerSuspended(e.Property))
{
int index = (int)e.NewValue;
// GetDataItem returns null if index is >= Count; we do not check newValue
// against Count here to avoid enumerating through an Enumerable twice.
// Setting SelectedItem coerces the final value of SelectedIndex.
object newSelectedItem = (index < 0) ? null : dataGrid.DataConnection.GetDataItem(index);
dataGrid.SelectedItem = newSelectedItem;
if (dataGrid.SelectedItem != newSelectedItem)
{
d.SetValueNoCallback(e.Property, e.OldValue);
}
}
}
/// <summary>
/// Gets or sets the data item corresponding to the selected row.
/// </summary>
public object SelectedItem
{
get { return GetValue(SelectedItemProperty) as object; }
set { SetValue(SelectedItemProperty, value); }
}
/// <summary>
/// Identifies the SelectedItem dependency property.
/// </summary>
public static readonly DependencyProperty SelectedItemProperty =
DependencyProperty.Register(
"SelectedItem",
typeof(object),
typeof(DataGrid),
new PropertyMetadata(null, OnSelectedItemPropertyChanged));
/// <summary>
/// SelectedItemProperty property changed handler.
/// </summary>
/// <param name="d">DataGrid that changed its SelectedItem.</param>
/// <param name="e">DependencyPropertyChangedEventArgs.</param>
private static void OnSelectedItemPropertyChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
{
DataGrid dataGrid = d as DataGrid;
if (!dataGrid.IsHandlerSuspended(e.Property))
{
int rowIndex = (e.NewValue == null) ? -1 : dataGrid.DataConnection.IndexOf(e.NewValue);
if (rowIndex == -1)
{
// If the Item is null or it's not found, clear the Selection
if (!dataGrid.CommitEdit(DataGridEditingUnit.Row, true /*exitEditing*/))
{
// Edited value couldn't be committed or aborted
d.SetValueNoCallback(e.Property, e.OldValue);
return;
}
// Clear all row selections
dataGrid.ClearRowSelection(true /*resetAnchorSlot*/);
}
else
{
int slot = dataGrid.SlotFromRowIndex(rowIndex);
if (slot != dataGrid.CurrentSlot)
{
if (!dataGrid.CommitEdit(DataGridEditingUnit.Row, true /*exitEditing*/))
{
// Edited value couldn't be committed or aborted
d.SetValueNoCallback(e.Property, e.OldValue);
return;
}
if (slot >= dataGrid.SlotCount || slot < -1)
{
if (dataGrid.DataConnection.CollectionView != null)
{
dataGrid.DataConnection.CollectionView.MoveCurrentToPosition(rowIndex);
}
}
}
int oldSelectedIndex = dataGrid.SelectedIndex;
if (oldSelectedIndex != rowIndex)
{
dataGrid.SetValueNoCallback(DataGrid.SelectedIndexProperty, rowIndex);
}
try
{
dataGrid._noSelectionChangeCount++;
int columnIndex = dataGrid.CurrentColumnIndex;
if (columnIndex == -1)
{
columnIndex = dataGrid.FirstDisplayedNonFillerColumnIndex;
}
if (dataGrid.IsSlotOutOfSelectionBounds(slot))
{
dataGrid.ClearRowSelection(slot /*slotException*/, true /*resetAnchorSlot*/);
return;
}
dataGrid.UpdateSelectionAndCurrency(columnIndex, slot, DataGridSelectionAction.SelectCurrent, false /*scrollIntoView*/);
}
finally
{
dataGrid.NoSelectionChangeCount--;
}
if (!dataGrid._successfullyUpdatedSelection)
{
dataGrid.SetValueNoCallback(DataGrid.SelectedIndexProperty, oldSelectedIndex);
d.SetValueNoCallback(e.Property, e.OldValue);
}
}
}
}
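// Illustrative usage sketch (hypothetical people collection): setting SelectedItem keeps
// SelectedIndex in sync through SetValueNoCallback, and an item not present in the data
// source clears the selection, as implemented in the handler above.
//
//   dataGrid.SelectedItem = people[2];      // SelectedIndex is coerced to 2
//   dataGrid.SelectedItem = new Person();   // item not in the source: selection is cleared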
/// <summary>
/// Gets or sets the <see cref="T:System.Windows.Media.Brush"/> that is used to paint grid lines separating columns.
/// </summary>
public Brush VerticalGridLinesBrush
{
get { return GetValue(VerticalGridLinesBrushProperty) as Brush; }
set { SetValue(VerticalGridLinesBrushProperty, value); }
}
/// <summary>
/// Identifies the VerticalGridLinesBrush dependency property.
/// </summary>
public static readonly DependencyProperty VerticalGridLinesBrushProperty =
DependencyProperty.Register(
"VerticalGridLinesBrush",
typeof(Brush),
typeof(DataGrid),
new PropertyMetadata(null, OnVerticalGridLinesBrushPropertyChanged));
/// <summary>
/// VerticalGridLinesBrushProperty property changed handler.
/// </summary>
/// <param name="d">DataGrid that changed its VerticalGridLinesBrush.</param>
/// <param name="e">DependencyPropertyChangedEventArgs.</param>
private static void OnVerticalGridLinesBrushPropertyChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
{
DataGrid dataGrid = d as DataGrid;
if (dataGrid._rowsPresenter != null)
{
foreach (DataGridRow row in dataGrid.GetAllRows())
{
row.EnsureGridLines();
}
}
}
/// <summary>
/// Gets or sets a value indicating how the vertical scroll bar is displayed.
/// </summary>
public ScrollBarVisibility VerticalScrollBarVisibility
{
get { return (ScrollBarVisibility)GetValue(VerticalScrollBarVisibilityProperty); }
set { SetValue(VerticalScrollBarVisibilityProperty, value); }
}
/// <summary>
/// Identifies the VerticalScrollBarVisibility dependency property.
/// </summary>
public static readonly DependencyProperty VerticalScrollBarVisibilityProperty =
DependencyProperty.Register(
"VerticalScrollBarVisibility",
typeof(ScrollBarVisibility),
typeof(DataGrid),
new PropertyMetadata(DATAGRID_defaultScrollBarVisibility, OnVerticalScrollBarVisibilityPropertyChanged));
/// <summary>
/// VerticalScrollBarVisibilityProperty property changed handler.
/// </summary>
/// <param name="d">DataGrid that changed its VerticalScrollBarVisibility.</param>
/// <param name="e">DependencyPropertyChangedEventArgs.</param>
private static void OnVerticalScrollBarVisibilityPropertyChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
{
DataGrid dataGrid = d as DataGrid;
if (!dataGrid.IsHandlerSuspended(e.Property) && (ScrollBarVisibility)e.NewValue != (ScrollBarVisibility)e.OldValue)
{
dataGrid.UpdateRowsPresenterManipulationMode(false /*horizontalMode*/, true /*verticalMode*/);
if (dataGrid._vScrollBar != null)
{
if (dataGrid.IsVerticalScrollBarOverCells)
{
dataGrid.ComputeScrollBarsLayout();
}
else
{
dataGrid.InvalidateMeasure();
}
}
}
}
/// <summary>
/// Gets a collection that contains all the columns in the control.
/// </summary>
public ObservableCollection<DataGridColumn> Columns
{
get
{
// We use a backing field here because the field's type
// is a subclass of the property's type.
return this.ColumnsInternal;
}
}
/// <summary>
/// Gets or sets the column that contains the current cell.
/// </summary>
public DataGridColumn CurrentColumn
{
get
{
if (this.CurrentColumnIndex == -1)
{
return null;
}
DiagnosticsDebug.Assert(this.CurrentColumnIndex < this.ColumnsItemsInternal.Count, "Expected CurrentColumnIndex smaller than ColumnsItemsInternal.Count.");
return this.ColumnsItemsInternal[this.CurrentColumnIndex];
}
set
{
DataGridColumn dataGridColumn = value;
if (dataGridColumn == null)
{
throw DataGridError.DataGrid.ValueCannotBeSetToNull("value", "CurrentColumn");
}
if (this.CurrentColumn != dataGridColumn)
{
if (dataGridColumn.OwningGrid != this)
{
// Provided column does not belong to this DataGrid
throw DataGridError.DataGrid.ColumnNotInThisDataGrid();
}
if (dataGridColumn.Visibility == Visibility.Collapsed)
{
// CurrentColumn cannot be set to an invisible column
throw DataGridError.DataGrid.ColumnCannotBeCollapsed();
}
if (this.CurrentSlot == -1)
{
// There is no current row so the current column cannot be set
throw DataGridError.DataGrid.NoCurrentRow();
}
bool beginEdit = _editingColumnIndex != -1;
if (!EndCellEdit(DataGridEditAction.Commit, true /*exitEditingMode*/, this.ContainsFocus /*keepFocus*/, true /*raiseEvents*/))
{
// Edited value couldn't be committed or aborted
return;
}
if (_noFocusedColumnChangeCount == 0)
{
this.ColumnHeaderHasFocus = false;
}
this.UpdateSelectionAndCurrency(dataGridColumn.Index, this.CurrentSlot, DataGridSelectionAction.None, false /*scrollIntoView*/);
DiagnosticsDebug.Assert(_successfullyUpdatedSelection, "Expected _successfullyUpdatedSelection is true.");
if (beginEdit &&
_editingColumnIndex == -1 &&
this.CurrentSlot != -1 &&
this.CurrentColumnIndex != -1 &&
this.CurrentColumnIndex == dataGridColumn.Index &&
dataGridColumn.OwningGrid == this &&
!GetColumnEffectiveReadOnlyState(dataGridColumn))
{
// Returning to editing mode since the grid was in that mode prior to the EndCellEdit call above.
BeginCellEdit(new RoutedEventArgs());
}
}
}
}
/// <summary>
/// Gets or sets the label to display in a DataGridRowGroupHeader when its PropertyName is not set.
/// </summary>
public string RowGroupHeaderPropertyNameAlternative
{
get
{
return _rowGroupHeaderPropertyNameAlternative;
}
set
{
_rowGroupHeaderPropertyNameAlternative = value;
}
}
/// <summary>
/// Gets the collection of styles that are used when rendering the row group headers.
/// </summary>
public ObservableCollection<Style> RowGroupHeaderStyles
{
get
{
return _rowGroupHeaderStyles;
}
}
/// <summary>
/// Gets a list that contains the data items corresponding to the selected rows.
/// </summary>
public IList SelectedItems
{
get { return _selectedItems as IList; }
}
/// <summary>
/// Gets the data item bound to the row that contains the current cell.
/// </summary>
protected object CurrentItem
{
get
{
if (this.CurrentSlot == -1 ||
this.RowGroupHeadersTable.Contains(this.CurrentSlot) ||
this.ItemsSource /*this.DataConnection.DataSource*/ == null)
{
return null;
}
return this.DataConnection.GetDataItem(RowIndexFromSlot(this.CurrentSlot));
}
}
internal static double HorizontalGridLinesThickness
{
get
{
return DATAGRID_horizontalGridLinesThickness;
}
}
internal int AnchorSlot
{
get;
private set;
}
internal double ActualRowHeaderWidth
{
get
{
if (!this.AreRowHeadersVisible)
{
return 0;
}
else
{
return !double.IsNaN(this.RowHeaderWidth) ? this.RowHeaderWidth : this.RowHeadersDesiredWidth;
}
}
}
internal double ActualRowsPresenterHeight
{
get
{
if (_rowsPresenter != null)
{
return _rowsPresenter.ActualHeight;
}
return 0;
}
}
internal bool AllowsManipulation
{
get
{
return _rowsPresenter != null &&
(_rowsPresenter.ManipulationMode & (ManipulationModes.TranslateX | ManipulationModes.TranslateY)) != ManipulationModes.None;
}
}
internal bool AreColumnHeadersVisible
{
get
{
return (this.HeadersVisibility & DataGridHeadersVisibility.Column) == DataGridHeadersVisibility.Column;
}
}
internal bool AreRowHeadersVisible
{
get
{
return (this.HeadersVisibility & DataGridHeadersVisibility.Row) == DataGridHeadersVisibility.Row;
}
}
/// <summary>
/// Gets or sets a value indicating whether or not at least one auto-sizing column is waiting for all the rows
/// to be measured before its final width is determined.
/// </summary>
internal bool AutoSizingColumns
{
get
{
return _autoSizingColumns;
}
set
{
if (_autoSizingColumns && value == false && this.ColumnsInternal != null)
{
double adjustment = this.CellsWidth - this.ColumnsInternal.VisibleEdgedColumnsWidth;
this.AdjustColumnWidths(0, adjustment, false);
foreach (DataGridColumn column in this.ColumnsInternal.GetVisibleColumns())
{
column.IsInitialDesiredWidthDetermined = true;
}
this.ColumnsInternal.EnsureVisibleEdgedColumnsWidth();
this.ComputeScrollBarsLayout();
InvalidateColumnHeadersMeasure();
InvalidateRowsMeasure(true);
}
_autoSizingColumns = value;
}
}
internal double AvailableSlotElementRoom
{
get;
set;
}
// Height currently available for cells; this value can be smaller than the control's height
// because it is reduced by the presence of ColumnHeaders or a horizontal scrollbar.
// Layout is asynchronous, so changes to the ColumnHeaders or the horizontal scrollbar are
// not reflected immediately.
internal double CellsHeight
{
get
{
return this.RowsPresenterAvailableSize.HasValue ? this.RowsPresenterAvailableSize.Value.Height : 0;
}
}
// Width currently available for cells; this value can be smaller than the control's width
// because it is reduced by the presence of RowHeaders or a vertical scrollbar.
// Layout is asynchronous, so changes to the RowHeaders or the vertical scrollbar are
// not reflected immediately.
internal double CellsWidth
{
get
{
double rowsWidth = double.PositiveInfinity;
if (this.RowsPresenterAvailableSize.HasValue)
{
rowsWidth = Math.Max(0, this.RowsPresenterAvailableSize.Value.Width - this.ActualRowHeaderWidth);
}
return double.IsPositiveInfinity(rowsWidth) ? this.ColumnsInternal.VisibleEdgedColumnsWidth : rowsWidth;
}
}
/// <summary>
/// Gets an empty content control that's used during the DataGrid's copy procedure
/// to determine the value of a ClipboardContentBinding for a particular column and item.
/// </summary>
internal ContentControl ClipboardContentControl
{
get
{
if (_clipboardContentControl == null)
{
_clipboardContentControl = new ContentControl();
}
return _clipboardContentControl;
}
}
internal bool ColumnHeaderHasFocus
{
get
{
return _columnHeaderHasFocus;
}
set
{
DiagnosticsDebug.Assert(!value || (this.ColumnHeaders != null && this.AreColumnHeadersVisible), "Expected value==False || (non-null ColumnHeaders and AreColumnHeadersVisible==True)");
if (_columnHeaderHasFocus != value)
{
_columnHeaderHasFocus = value;
if (this.CurrentColumn != null && this.IsSlotVisible(this.CurrentSlot))
{
UpdateCurrentState(this.DisplayData.GetDisplayedElement(this.CurrentSlot), this.CurrentColumnIndex, true /*applyCellState*/);
}
DataGridColumn oldFocusedColumn = this.FocusedColumn;
this.FocusedColumn = null;
if (_columnHeaderHasFocus)
{
this.FocusedColumn = this.CurrentColumn == null ? this.ColumnsInternal.FirstVisibleNonFillerColumn : this.CurrentColumn;
}
if (oldFocusedColumn != null && oldFocusedColumn.HasHeaderCell)
{
oldFocusedColumn.HeaderCell.ApplyState(true);
}
if (this.FocusedColumn != null && this.FocusedColumn.HasHeaderCell)
{
this.FocusedColumn.HeaderCell.ApplyState(true);
ScrollColumnIntoView(this.FocusedColumn.Index);
}
}
}
}
internal DataGridColumnHeaderInteractionInfo ColumnHeaderInteractionInfo
{
get;
set;
}
internal DataGridColumnHeadersPresenter ColumnHeaders
{
get
{
return _columnHeadersPresenter;
}
}
internal DataGridColumnCollection ColumnsInternal
{
get;
private set;
}
internal List<DataGridColumn> ColumnsItemsInternal
{
get
{
return this.ColumnsInternal.ItemsInternal;
}
}
internal bool ContainsFocus
{
get;
private set;
}
internal int CurrentColumnIndex
{
get
{
return this.CurrentCellCoordinates.ColumnIndex;
}
private set
{
this.CurrentCellCoordinates.ColumnIndex = value;
}
}
internal int CurrentSlot
{
get
{
return this.CurrentCellCoordinates.Slot;
}
private set
{
this.CurrentCellCoordinates.Slot = value;
}
}
internal DataGridDataConnection DataConnection
{
get;
private set;
}
internal DataGridDisplayData DisplayData
{
get;
private set;
}
internal int EditingColumnIndex
{
get
{
return _editingColumnIndex;
}
}
internal DataGridRow EditingRow
{
get;
private set;
}
internal double FirstDisplayedScrollingColumnHiddenWidth
{
get
{
return _negHorizontalOffset;
}
}
internal DataGridColumn FocusedColumn
{
get;
set;
}
internal bool HasColumnUserInteraction
{
get
{
return this.ColumnHeaderInteractionInfo.HasUserInteraction;
}
}
// When the RowsPresenter's width increases, the HorizontalOffset will be incorrect until
// the scrollbar's layout is recalculated, which doesn't occur until after the cells are measured.
// This property exists to account for this scenario, and avoid collapsing the incorrect cells.
internal double HorizontalAdjustment
{
get;
private set;
}
// The sum of the widths, in pixels, of the scrolling columns preceding
// the first displayed scrolling column.
internal double HorizontalOffset
{
get
{
return _horizontalOffset;
}
set
{
if (value < 0)
{
value = 0;
}
double widthNotVisible = Math.Max(0, this.ColumnsInternal.VisibleEdgedColumnsWidth - this.CellsWidth);
if (value > widthNotVisible)
{
value = widthNotVisible;
}
if (value == _horizontalOffset)
{
return;
}
SetHorizontalOffset(value);
_horizontalOffset = value;
this.DisplayData.FirstDisplayedScrollingCol = ComputeFirstVisibleScrollingColumn();
// update the lastTotallyDisplayedScrollingCol
ComputeDisplayedColumns();
}
}
internal ScrollBar HorizontalScrollBar
{
get
{
return _hScrollBar;
}
}
internal bool LoadingOrUnloadingRow
{
get;
private set;
}
internal bool InDisplayIndexAdjustments
{
get;
set;
}
internal double NegVerticalOffset
{
get;
private set;
}
internal int NoCurrentCellChangeCount
{
get
{
return _noCurrentCellChangeCount;
}
set
{
DiagnosticsDebug.Assert(value >= 0, "Expected non-negative NoCurrentCellChangeCount.");
_noCurrentCellChangeCount = value;
if (value == 0)
{
FlushCurrentCellChanged();
}
}
}
internal double RowDetailsHeightEstimate
{
get;
private set;
}
internal double RowHeadersDesiredWidth
{
get
{
return _rowHeaderDesiredWidth;
}
set
{
// We only auto grow
if (_rowHeaderDesiredWidth < value)
{
double oldActualRowHeaderWidth = this.ActualRowHeaderWidth;
_rowHeaderDesiredWidth = value;
if (oldActualRowHeaderWidth != this.ActualRowHeaderWidth)
{
bool invalidated = EnsureRowHeaderWidth();
// If we didn't invalidate in Ensure and we have star columns, force the column widths to be recomputed here.
if (!invalidated && this.ColumnsInternal.VisibleStarColumnCount > 0)
{
this.ColumnsInternal.EnsureVisibleEdgedColumnsWidth();
InvalidateMeasure();
}
}
}
}
}
internal double RowGroupHeaderHeightEstimate
{
get;
private set;
}
internal IndexToValueTable<DataGridRowGroupInfo> RowGroupHeadersTable
{
get;
private set;
}
internal double[] RowGroupSublevelIndents
{
get;
private set;
}
internal double RowHeightEstimate
{
get;
private set;
}
internal Size? RowsPresenterAvailableSize
{
get
{
return _rowsPresenterAvailableSize;
}
set
{
if (_rowsPresenterAvailableSize.HasValue && value.HasValue && value.Value.Width > this.RowsPresenterAvailableSize.Value.Width)
{
// When the available cells width increases, the horizontal offset can be incorrect.
// Store away an adjustment to use during the CellsPresenter's measure, so that the
// ShouldDisplayCell method correctly determines if a cell will be in view.
//
// | h. offset | new available cells width |
// |-------------->|----------------------------------------->|
// __________________________________________________ |
// | | | | | |
// | column0 | column1 | column2 | column3 |<----->|
// | | | | | adj. |
double adjustment = (_horizontalOffset + value.Value.Width) - this.ColumnsInternal.VisibleEdgedColumnsWidth;
this.HorizontalAdjustment = Math.Min(this.HorizontalOffset, Math.Max(0, adjustment));
}
else
{
this.HorizontalAdjustment = 0;
}
bool loadMoreDataFromIncrementalItemsSource = _rowsPresenterAvailableSize.HasValue && value.HasValue && value.Value.Height > _rowsPresenterAvailableSize.Value.Height;
_rowsPresenterAvailableSize = value;
if (loadMoreDataFromIncrementalItemsSource)
{
LoadMoreDataFromIncrementalItemsSource();
}
}
}
// This flag indicates whether selection has actually changed during a selection operation,
// and exists to ensure that FlushSelectionChanged doesn't unnecessarily raise SelectionChanged.
internal bool SelectionHasChanged
{
get;
set;
}
internal int SlotCount
{
get;
private set;
}
internal bool UpdatedStateOnTapped
{
get;
set;
}
/// <summary>
/// Gets a value indicating whether or not to use star-sizing logic. If the DataGrid has infinite available width,
/// then star sizing doesn't make sense. In this case, all star columns grow to a predefined size of
/// 10,000 pixels in order to show the developer that star columns shouldn't be used.
/// </summary>
internal bool UsesStarSizing
{
get
{
if (this.ColumnsInternal != null)
{
return this.ColumnsInternal.VisibleStarColumnCount > 0 &&
(!this.RowsPresenterAvailableSize.HasValue || !double.IsPositiveInfinity(this.RowsPresenterAvailableSize.Value.Width));
}
return false;
}
}
internal double VerticalOffset
{
get
{
return _verticalOffset;
}
set
{
bool loadMoreDataFromIncrementalItemsSource = _verticalOffset < value;
_verticalOffset = value;
if (loadMoreDataFromIncrementalItemsSource)
{
LoadMoreDataFromIncrementalItemsSource();
}
}
}
internal ScrollBar VerticalScrollBar
{
get
{
return _vScrollBar;
}
}
internal int VisibleSlotCount
{
get;
set;
}
private bool AreAllScrollBarsCollapsed
{
get
{
return (_hScrollBar == null || _hScrollBar.Visibility == Visibility.Collapsed) &&
(_vScrollBar == null || _vScrollBar.Visibility == Visibility.Collapsed);
}
}
private bool AreBothScrollBarsVisible
{
get
{
return _hScrollBar != null && _hScrollBar.Visibility == Visibility.Visible &&
_vScrollBar != null && _vScrollBar.Visibility == Visibility.Visible;
}
}
private DataGridCellCoordinates CurrentCellCoordinates
{
get
{
return _currentCellCoordinates;
}
set
{
_currentCellCoordinates = value;
}
}
private int FirstDisplayedNonFillerColumnIndex
{
get
{
DataGridColumn column = this.ColumnsInternal.FirstVisibleNonFillerColumn;
if (column != null)
{
if (column.IsFrozen)
{
return column.Index;
}
else
{
if (this.DisplayData.FirstDisplayedScrollingCol >= column.Index)
{
return this.DisplayData.FirstDisplayedScrollingCol;
}
else
{
return column.Index;
}
}
}
return -1;
}
}
private bool IsHorizontalScrollBarInteracting
{
get
{
return _isHorizontalScrollBarInteracting;
}
set
{
if (_isHorizontalScrollBarInteracting != value)
{
_isHorizontalScrollBarInteracting = value;
if (_hScrollBar != null)
{
if (_isHorizontalScrollBarInteracting)
{
// Prevent the vertical scroll bar from fading out while the user is interacting with the horizontal one.
_keepScrollBarsShowing = true;
ShowScrollBars();
}
else
{
// Make the scroll bars fade out, after the normal delay.
_keepScrollBarsShowing = false;
HideScrollBars(true /*useTransitions*/);
}
}
}
}
}
private bool IsHorizontalScrollBarOverCells
{
get
{
return _columnHeadersPresenter != null && Grid.GetColumnSpan(_columnHeadersPresenter) == 2;
}
}
private bool IsVerticalScrollBarInteracting
{
get
{
return _isVerticalScrollBarInteracting;
}
set
{
if (_isVerticalScrollBarInteracting != value)
{
_isVerticalScrollBarInteracting = value;
if (_vScrollBar != null)
{
if (_isVerticalScrollBarInteracting)
{
// Prevent the horizontal scroll bar from fading out while the user is interacting with the vertical one.
_keepScrollBarsShowing = true;
ShowScrollBars();
}
else
{
// Make the scroll bars fade out, after the normal delay.
_keepScrollBarsShowing = false;
HideScrollBars(true /*useTransitions*/);
}
}
}
}
}
private bool IsVerticalScrollBarOverCells
{
get
{
return _rowsPresenter != null && Grid.GetRowSpan(_rowsPresenter) == 2;
}
}
private VirtualKey LastHandledKeyDown
{
get;
set;
}
private int NoSelectionChangeCount
{
get
{
return _noSelectionChangeCount;
}
set
{
DiagnosticsDebug.Assert(value >= 0, "Expected non-negative NoSelectionChangeCount.");
_noSelectionChangeCount = value;
if (value == 0)
{
FlushSelectionChanged();
}
}
}
/// <summary>
/// Enters editing mode for the current cell and current row (if they're not already in editing mode).
/// </summary>
/// <returns>True if operation was successful. False otherwise.</returns>
public bool BeginEdit()
{
return BeginEdit(null);
}
/// <summary>
/// Enters editing mode for the current cell and current row (if they're not already in editing mode).
/// </summary>
/// <param name="editingEventArgs">Provides information about the user gesture that caused the call to BeginEdit. Can be null.</param>
/// <returns>True if operation was successful. False otherwise.</returns>
public bool BeginEdit(RoutedEventArgs editingEventArgs)
{
if (this.CurrentColumnIndex == -1 || !GetRowSelection(this.CurrentSlot))
{
return false;
}
DiagnosticsDebug.Assert(this.CurrentColumnIndex >= 0, "Expected non-negative CurrentColumnIndex.");
DiagnosticsDebug.Assert(this.CurrentColumnIndex < this.ColumnsItemsInternal.Count, "Expected CurrentColumnIndex smaller than ColumnsItemsInternal.Count.");
DiagnosticsDebug.Assert(this.CurrentSlot >= -1, "Expected CurrentSlot greater than or equal to -1.");
DiagnosticsDebug.Assert(this.CurrentSlot < this.SlotCount, "Expected CurrentSlot smaller than SlotCount.");
DiagnosticsDebug.Assert(this.EditingRow == null || this.EditingRow.Slot == this.CurrentSlot, "Expected null EditingRow or EditingRow.Slot equal to CurrentSlot.");
if (GetColumnEffectiveReadOnlyState(this.CurrentColumn))
{
// Current column is read-only
return false;
}
return BeginCellEdit(editingEventArgs);
}
/// <summary>
/// Cancels editing mode and restores the original value.
/// </summary>
/// <returns>True if operation was successful. False otherwise.</returns>
public bool CancelEdit()
{
return CancelEdit(DataGridEditingUnit.Row);
}
/// <summary>
/// Cancels editing mode for the specified DataGridEditingUnit and restores its original value.
/// </summary>
/// <param name="editingUnit">Specifies whether to cancel edit for a Cell or Row.</param>
/// <returns>True if operation was successful. False otherwise.</returns>
public bool CancelEdit(DataGridEditingUnit editingUnit)
{
return this.CancelEdit(editingUnit, true /*raiseEvents*/);
}
/// <summary>
/// Commits editing mode and pushes changes to the backend.
/// </summary>
/// <returns>True if operation was successful. False otherwise.</returns>
public bool CommitEdit()
{
return CommitEdit(DataGridEditingUnit.Row, true);
}
/// <summary>
/// Commits editing mode for the specified DataGridEditingUnit and pushes changes to the backend.
/// </summary>
/// <param name="editingUnit">Specifies whether to commit edit for a Cell or Row.</param>
/// <param name="exitEditingMode">Editing mode is left if True.</param>
/// <returns>True if operation was successful. False otherwise.</returns>
public bool CommitEdit(DataGridEditingUnit editingUnit, bool exitEditingMode)
{
if (!EndCellEdit(DataGridEditAction.Commit, editingUnit == DataGridEditingUnit.Cell ? exitEditingMode : true, this.ContainsFocus /*keepFocus*/, true /*raiseEvents*/))
{
return false;
}
if (editingUnit == DataGridEditingUnit.Row)
{
return EndRowEdit(DataGridEditAction.Commit, exitEditingMode, true /*raiseEvents*/);
}
return true;
}
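// Illustrative usage sketch (assumes a DataGrid instance named "dataGrid"; not part of this file's logic):
// programmatically edit the current cell, then either commit or revert the whole row.
//
//   if (dataGrid.BeginEdit())
//   {
//       // ... user or code mutates the editing element ...
//       bool committed = dataGrid.CommitEdit(DataGridEditingUnit.Row, true /*exitEditingMode*/);
//       if (!committed)
//       {
//           dataGrid.CancelEdit(DataGridEditingUnit.Row);
//       }
//   }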
/// <summary>
/// Returns the Group at the indicated level or null if the item is not in the ItemsSource
/// </summary>
/// <param name="item">The item whose containing group is requested.</param>
/// <param name="groupLevel">The level of the group to return.</param>
/// <returns>The group the given item falls under or null if the item is not in the ItemsSource</returns>
public ICollectionViewGroup GetGroupFromItem(object item, int groupLevel)
{
int itemIndex = this.DataConnection.IndexOf(item);
if (itemIndex == -1)
{
return null;
}
int groupHeaderSlot = this.RowGroupHeadersTable.GetPreviousIndex(SlotFromRowIndex(itemIndex));
DataGridRowGroupInfo rowGroupInfo = this.RowGroupHeadersTable.GetValueAt(groupHeaderSlot);
while (rowGroupInfo != null && rowGroupInfo.Level != groupLevel)
{
groupHeaderSlot = this.RowGroupHeadersTable.GetPreviousIndex(rowGroupInfo.Slot);
rowGroupInfo = this.RowGroupHeadersTable.GetValueAt(groupHeaderSlot);
}
return rowGroupInfo == null ? null : rowGroupInfo.CollectionViewGroup;
}
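// Illustrative usage sketch (hypothetical "dataGrid" and "item" variables; not part of this file's logic):
// retrieve the group an item belongs to at a given level (e.g. 0 for the outermost level),
// walking up from the item's nearest group header as implemented above.
//
//   ICollectionViewGroup group = dataGrid.GetGroupFromItem(item, 0);
//   if (group != null)
//   {
//       // The item is grouped; group.GroupItems contains its siblings at that level.
//   }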
/// <summary>
/// Scrolls the specified item or RowGroupHeader and/or column into view.
/// If item is not null: scrolls the row representing the item into view;
/// If column is not null: scrolls the column into view;
/// If both item and column are null, the method returns without scrolling.
/// </summary>
/// <param name="item">an item from the DataGrid's items source or a CollectionViewGroup from the collection view</param>
/// <param name="column">a column from the DataGrid's columns collection</param>
public void ScrollIntoView(object item, DataGridColumn column)
{
if ((column == null && (item == null || this.FirstDisplayedNonFillerColumnIndex == -1)) ||
(column != null && column.OwningGrid != this))
{
// no-op
return;
}
if (item == null)
{
// scroll column into view
this.ScrollSlotIntoView(column.Index, this.DisplayData.FirstScrollingSlot, false /*forCurrentCellChange*/, true /*forceHorizontalScroll*/);
}
else
{
int slot;
DataGridRowGroupInfo rowGroupInfo = null;
ICollectionViewGroup collectionViewGroup = item as ICollectionViewGroup;
if (collectionViewGroup != null)
{
rowGroupInfo = RowGroupInfoFromCollectionViewGroup(collectionViewGroup);
if (rowGroupInfo == null)
{
Debug.Fail("Expected non-null rowGroupInfo.");
return;
}
slot = rowGroupInfo.Slot;
}
else
{
// the row index will be set to -1 if the item is null or not in the list
int rowIndex = this.DataConnection.IndexOf(item);
if (rowIndex == -1 || (this.IsReadOnly && rowIndex == this.DataConnection.NewItemPlaceholderIndex))
{
return;
}
slot = SlotFromRowIndex(rowIndex);
}
int columnIndex = (column == null) ? this.FirstDisplayedNonFillerColumnIndex : column.Index;
if (_collapsedSlotsTable.Contains(slot))
{
// We need to expand all parent RowGroups so that the slot is visible
if (rowGroupInfo != null)
{
ExpandRowGroupParentChain(rowGroupInfo.Level - 1, rowGroupInfo.Slot);
}
else
{
rowGroupInfo = this.RowGroupHeadersTable.GetValueAt(this.RowGroupHeadersTable.GetPreviousIndex(slot));
DiagnosticsDebug.Assert(rowGroupInfo != null, "Expected non-null rowGroupInfo.");
if (rowGroupInfo != null)
{
ExpandRowGroupParentChain(rowGroupInfo.Level, rowGroupInfo.Slot);
}
}
// Update ScrollBar and display information
this.NegVerticalOffset = 0;
SetVerticalOffset(0);
ResetDisplayedRows();
this.DisplayData.FirstScrollingSlot = 0;
ComputeScrollBarsLayout();
}
ScrollSlotIntoView(columnIndex, slot, true /*forCurrentCellChange*/, true /*forceHorizontalScroll*/);
}
}
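// Illustrative usage sketch (hypothetical "dataGrid" and "item" variables; not part of this file's logic):
// bring a cell into view, expanding any collapsed parent row groups along the way.
//
//   dataGrid.ScrollIntoView(item, dataGrid.Columns[0]);   // scroll both the row and the column into view
//   dataGrid.ScrollIntoView(item, null);                  // scroll only the row into view
//   dataGrid.ScrollIntoView(null, dataGrid.Columns[0]);   // scroll only the column into view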
/// <summary>
/// Arranges the content of the <see cref="DataGrid"/>.
/// </summary>
/// <param name="finalSize">
/// The final area within the parent that this element should use to arrange itself and its children.
/// </param>
/// <returns>
/// The actual size used by the <see cref="DataGrid"/>.
/// </returns>
protected override Size ArrangeOverride(Size finalSize)
{
if (_makeFirstDisplayedCellCurrentCellPending)
{
MakeFirstDisplayedCellCurrentCell();
}
if (this.ActualWidth != finalSize.Width)
{
// If our final width has changed, we might need to update the filler
InvalidateColumnHeadersArrange();
InvalidateCellsArrange();
}
return base.ArrangeOverride(finalSize);
}
/// <summary>
/// Measures the children of a <see cref="DataGrid"/> to prepare for
/// arranging them during the <see cref="ArrangeOverride"/> pass.
/// </summary>
/// <returns>
/// The size that the <see cref="DataGrid"/> determines it needs during layout, based on its calculations of child object allocated sizes.
/// </returns>
/// <param name="availableSize">
/// The available size that this element can give to child elements. Indicates an upper limit that
/// child elements should not exceed.
/// </param>
protected override Size MeasureOverride(Size availableSize)
{
// Delay layout until after the initial measure to avoid invalid calculations when the
// DataGrid is not part of the visual tree
if (!_measured)
{
_measured = true;
// We don't need to clear the rows because it was already done when the ItemsSource changed
RefreshRowsAndColumns(false /*clearRows*/);
// Update our estimates now that the DataGrid has all of the information necessary
UpdateRowDetailsHeightEstimate();
// Update frozen columns to account for columns added prior to loading or auto-generated columns
if (this.FrozenColumnCountWithFiller > 0)
{
ProcessFrozenColumnCount(this);
}
}
Size desiredSize;
// This is a shortcut to skip layout if we don't have any columns
if (this.ColumnsInternal.Count == 0)
{
if (_hScrollBar != null && _hScrollBar.Visibility != Visibility.Collapsed)
{
_hScrollBar.Visibility = Visibility.Collapsed;
}
if (_vScrollBar != null && _vScrollBar.Visibility != Visibility.Collapsed)
{
_vScrollBar.Visibility = Visibility.Collapsed;
}
desiredSize = base.MeasureOverride(availableSize);
}
else
{
if (_rowsPresenter != null)
{
_rowsPresenter.InvalidateMeasure();
}
InvalidateColumnHeadersMeasure();
desiredSize = base.MeasureOverride(availableSize);
ComputeScrollBarsLayout();
}
return desiredSize;
}
/// <summary>
/// Comparator class so we can sort list by the display index
/// </summary>
public class DisplayIndexComparer : IComparer<DataGridColumn>
{
// Orders columns by DisplayIndexWithFiller, ascending. Equal display indexes
// compare as greater, so this comparer is suitable for sorting only, not for equality tests.
int IComparer<DataGridColumn>.Compare(DataGridColumn x, DataGridColumn y)
{
return (x.DisplayIndexWithFiller < y.DisplayIndexWithFiller) ? -1 : 1;
}
}
/// <summary>
/// Builds the visual tree for the column header when a new template is applied.
/// </summary>
protected override void OnApplyTemplate()
{
// The template has changed, so we need to refresh the visuals
_measured = false;
_hasNoIndicatorStateStoryboardCompletedHandler = false;
_keepScrollBarsShowing = false;
if (_columnHeadersPresenter != null)
{
// If we're applying a new template, we want to remove the old column headers first
_columnHeadersPresenter.Children.Clear();
}
_columnHeadersPresenter = GetTemplateChild(DATAGRID_elementColumnHeadersPresenterName) as DataGridColumnHeadersPresenter;
if (_columnHeadersPresenter != null)
{
if (this.ColumnsInternal.FillerColumn != null)
{
this.ColumnsInternal.FillerColumn.IsRepresented = false;
}
_columnHeadersPresenter.OwningGrid = this;
// Columns were added before our Template was applied, add the ColumnHeaders now
List<DataGridColumn> sortedInternal = new List<DataGridColumn>(this.ColumnsItemsInternal);
sortedInternal.Sort(new DisplayIndexComparer());
foreach (DataGridColumn column in sortedInternal)
{
InsertDisplayedColumnHeader(column);
}
}
if (_rowsPresenter != null)
{
// If we're applying a new template, we want to remove the old rows first
this.UnloadElements(false /*recycle*/);
}
_rowsPresenter = GetTemplateChild(DATAGRID_elementRowsPresenterName) as DataGridRowsPresenter;
if (_rowsPresenter != null)
{
_rowsPresenter.OwningGrid = this;
InvalidateRowHeightEstimate();
UpdateRowDetailsHeightEstimate();
UpdateRowsPresenterManipulationMode(true /*horizontalMode*/, true /*verticalMode*/);
}
_frozenColumnScrollBarSpacer = GetTemplateChild(DATAGRID_elementFrozenColumnScrollBarSpacerName) as FrameworkElement;
if (_hScrollBar != null)
{
_isHorizontalScrollBarInteracting = false;
_isPointerOverHorizontalScrollBar = false;
UnhookHorizontalScrollBarEvents();
}
_hScrollBar = GetTemplateChild(DATAGRID_elementHorizontalScrollBarName) as ScrollBar;
if (_hScrollBar != null)
{
_hScrollBar.IsTabStop = false;
_hScrollBar.Maximum = 0.0;
_hScrollBar.Orientation = Orientation.Horizontal;
_hScrollBar.Visibility = Visibility.Collapsed;
HookHorizontalScrollBarEvents();
}
if (_vScrollBar != null)
{
_isVerticalScrollBarInteracting = false;
_isPointerOverVerticalScrollBar = false;
UnhookVerticalScrollBarEvents();
}
_vScrollBar = GetTemplateChild(DATAGRID_elementVerticalScrollBarName) as ScrollBar;
if (_vScrollBar != null)
{
_vScrollBar.IsTabStop = false;
_vScrollBar.Maximum = 0.0;
_vScrollBar.Orientation = Orientation.Vertical;
_vScrollBar.Visibility = Visibility.Collapsed;
HookVerticalScrollBarEvents();
}
_topLeftCornerHeader = GetTemplateChild(DATAGRID_elementTopLeftCornerHeaderName) as ContentControl;
EnsureTopLeftCornerHeader(); // EnsureTopLeftCornerHeader checks for a null _topLeftCornerHeader;
_topRightCornerHeader = GetTemplateChild(DATAGRID_elementTopRightCornerHeaderName) as ContentControl;
_bottomRightCorner = GetTemplateChild(DATAGRID_elementBottomRightCornerHeaderName) as UIElement;
#if FEATURE_VALIDATION_SUMMARY
if (_validationSummary != null)
{
_validationSummary.FocusingInvalidControl -= new EventHandler<FocusingInvalidControlEventArgs>(ValidationSummary_FocusingInvalidControl);
_validationSummary.SelectionChanged -= new EventHandler<SelectionChangedEventArgs>(ValidationSummary_SelectionChanged);
}
_validationSummary = GetTemplateChild(DATAGRID_elementValidationSummary) as ValidationSummary;
if (_validationSummary != null)
{
// The ValidationSummary defaults to using its parent if Target is null, so the only
// way to prevent it from automatically picking up errors is to set it to some useless element.
if (_validationSummary.Target == null)
{
_validationSummary.Target = new Rectangle();
}
_validationSummary.FocusingInvalidControl += new EventHandler<FocusingInvalidControlEventArgs>(ValidationSummary_FocusingInvalidControl);
_validationSummary.SelectionChanged += new EventHandler<SelectionChangedEventArgs>(ValidationSummary_SelectionChanged);
if (Windows.ApplicationModel.DesignMode.DesignModeEnabled)
{
DiagnosticsDebug.Assert(_validationSummary.Errors != null, "Expected non-null _validationSummary.Errors.");
// Do not add the default design time errors when in design mode.
_validationSummary.Errors.Clear();
}
}
#endif
FrameworkElement root = GetTemplateChild(DATAGRID_elementRootName) as FrameworkElement;
if (root != null)
{
IList<VisualStateGroup> rootVisualStateGroups = VisualStateManager.GetVisualStateGroups(root);
if (rootVisualStateGroups != null)
{
int groupCount = rootVisualStateGroups.Count;
for (int groupIndex = 0; groupIndex < groupCount; groupIndex++)
{
VisualStateGroup group = rootVisualStateGroups[groupIndex];
if (group != null)
{
IList<VisualState> visualStates = group.States;
if (visualStates != null)
{
int stateCount = visualStates.Count;
for (int stateIndex = 0; stateIndex < stateCount; stateIndex++)
{
VisualState state = visualStates[stateIndex];
if (state != null)
{
string stateName = state.Name;
Storyboard stateStoryboard = state.Storyboard;
if (stateStoryboard != null)
{
if (stateName == VisualStates.StateNoIndicator)
{
stateStoryboard.Completed += NoIndicatorStateStoryboard_Completed;
_hasNoIndicatorStateStoryboardCompletedHandler = true;
}
else if (stateName == VisualStates.StateTouchIndicator || stateName == VisualStates.StateMouseIndicator || stateName == VisualStates.StateMouseIndicatorFull)
{
stateStoryboard.Completed += IndicatorStateStoryboard_Completed;
}
}
}
}
}
}
}
}
}
HideScrollBars(false /*useTransitions*/);
UpdateDisabledVisual();
}
/// <summary>
/// Raises the AutoGeneratingColumn event.
/// </summary>
protected virtual void OnAutoGeneratingColumn(DataGridAutoGeneratingColumnEventArgs e)
{
EventHandler<DataGridAutoGeneratingColumnEventArgs> handler = this.AutoGeneratingColumn;
if (handler != null)
{
handler(this, e);
}
}
/// <summary>
/// Raises the BeginningEdit event.
/// </summary>
protected virtual void OnBeginningEdit(DataGridBeginningEditEventArgs e)
{
EventHandler<DataGridBeginningEditEventArgs> handler = this.BeginningEdit;
if (handler != null)
{
handler(this, e);
}
}
/// <summary>
/// Raises the CellEditEnded event.
/// </summary>
protected virtual void OnCellEditEnded(DataGridCellEditEndedEventArgs e)
{
EventHandler<DataGridCellEditEndedEventArgs> handler = this.CellEditEnded;
if (handler != null)
{
handler(this, e);
}
// Raise the automation invoke event for the cell that just ended edit
DataGridAutomationPeer peer = DataGridAutomationPeer.FromElement(this) as DataGridAutomationPeer;
if (peer != null && AutomationPeer.ListenerExists(AutomationEvents.InvokePatternOnInvoked))
{
peer.RaiseAutomationInvokeEvents(DataGridEditingUnit.Cell, e.Column, e.Row);
}
}
/// <summary>
/// Raises the CellEditEnding event.
/// </summary>
protected virtual void OnCellEditEnding(DataGridCellEditEndingEventArgs e)
{
EventHandler<DataGridCellEditEndingEventArgs> handler = this.CellEditEnding;
if (handler != null)
{
handler(this, e);
}
}
/// <summary>
/// This method raises the CopyingRowClipboardContent event.
/// </summary>
/// <param name="e">Contains the necessary information for generating the row clipboard content.</param>
protected virtual void OnCopyingRowClipboardContent(DataGridRowClipboardEventArgs e)
{
EventHandler<DataGridRowClipboardEventArgs> handler = this.CopyingRowClipboardContent;
if (handler != null)
{
handler(this, e);
}
}
/// <summary>
/// Creates AutomationPeer (<see cref="UIElement.OnCreateAutomationPeer"/>)
/// </summary>
/// <returns>An automation peer for this <see cref="DataGrid"/>.</returns>
protected override AutomationPeer OnCreateAutomationPeer()
{
return new DataGridAutomationPeer(this);
}
/// <summary>
/// Raises the CurrentCellChanged event.
/// </summary>
protected virtual void OnCurrentCellChanged(EventArgs e)
{
EventHandler<EventArgs> handler = this.CurrentCellChanged;
if (handler != null)
{
handler(this, e);
}
if (AutomationPeer.ListenerExists(AutomationEvents.SelectionItemPatternOnElementSelected))
{
DataGridAutomationPeer peer = DataGridAutomationPeer.FromElement(this) as DataGridAutomationPeer;
if (peer != null)
{
peer.RaiseAutomationCellSelectedEvent(this.CurrentSlot, this.CurrentColumnIndex);
}
}
}
/// <summary>
/// Raises the LoadingRow event for row preparation.
/// </summary>
protected virtual void OnLoadingRow(DataGridRowEventArgs e)
{
EventHandler<DataGridRowEventArgs> handler = this.LoadingRow;
if (handler != null)
{
DiagnosticsDebug.Assert(!_loadedRows.Contains(e.Row), "Expected e.Row not contained in _loadedRows.");
_loadedRows.Add(e.Row);
this.LoadingOrUnloadingRow = true;
try
{
handler(this, e);
}
finally
{
this.LoadingOrUnloadingRow = false;
DiagnosticsDebug.Assert(_loadedRows.Contains(e.Row), "Expected e.Row contained in _loadedRows.");
_loadedRows.Remove(e.Row);
}
}
}
/// <summary>
/// Raises the LoadingRowGroup event
/// </summary>
/// <param name="e">EventArgs</param>
protected virtual void OnLoadingRowGroup(DataGridRowGroupHeaderEventArgs e)
{
EventHandler<DataGridRowGroupHeaderEventArgs> handler = this.LoadingRowGroup;
if (handler != null)
{
this.LoadingOrUnloadingRow = true;
try
{
handler(this, e);
}
finally
{
this.LoadingOrUnloadingRow = false;
}
}
}
/// <summary>
/// Raises the LoadingRowDetails for row details preparation
/// </summary>
protected virtual void OnLoadingRowDetails(DataGridRowDetailsEventArgs e)
{
EventHandler<DataGridRowDetailsEventArgs> handler = this.LoadingRowDetails;
if (handler != null)
{
this.LoadingOrUnloadingRow = true;
try
{
handler(this, e);
}
finally
{
this.LoadingOrUnloadingRow = false;
}
}
}
/// <summary>
/// Scrolls the DataGrid according to the direction of the delta.
/// </summary>
/// <param name="e">PointerRoutedEventArgs</param>
protected override void OnPointerWheelChanged(PointerRoutedEventArgs e)
{
base.OnPointerWheelChanged(e);
if (!e.Handled)
{
PointerPoint pointerPoint = e.GetCurrentPoint(this);
// A horizontal scroll happens if the mouse has a horizontal wheel OR if the horizontal scrollbar is not disabled AND the vertical scrollbar IS disabled
bool isForHorizontalScroll = pointerPoint.Properties.IsHorizontalMouseWheel ||
(this.HorizontalScrollBarVisibility != ScrollBarVisibility.Disabled && this.VerticalScrollBarVisibility == ScrollBarVisibility.Disabled);
if ((isForHorizontalScroll && this.HorizontalScrollBarVisibility == ScrollBarVisibility.Disabled) ||
(!isForHorizontalScroll && this.VerticalScrollBarVisibility == ScrollBarVisibility.Disabled))
{
return;
}
double offsetDelta = -pointerPoint.Properties.MouseWheelDelta / DATAGRID_mouseWheelDeltaDivider;
if (isForHorizontalScroll && pointerPoint.Properties.IsHorizontalMouseWheel)
{
offsetDelta *= -1.0;
}
e.Handled = ProcessScrollOffsetDelta(offsetDelta, isForHorizontalScroll);
}
}
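// Worked example of the delta math above (comments only; the divider value is an assumption,
// not taken from this file): a single vertical wheel notch reports MouseWheelDelta = 120, so with
// a divider of, say, 4.0 the offset delta is -120 / 4.0 = -30, i.e. wheel-up decreases the vertical
// offset and scrolls toward the top. Horizontal wheel deltas are negated once more because their
// sign convention is inverted relative to vertical wheels.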
/// <summary>
/// Raises the PreparingCellForEdit event.
/// </summary>
protected virtual void OnPreparingCellForEdit(DataGridPreparingCellForEditEventArgs e)
{
EventHandler<DataGridPreparingCellForEditEventArgs> handler = this.PreparingCellForEdit;
if (handler != null)
{
handler(this, e);
}
// Raise the automation invoke event for the cell that just began edit because now
// its editable content has been loaded
DataGridAutomationPeer peer = DataGridAutomationPeer.FromElement(this) as DataGridAutomationPeer;
if (peer != null && AutomationPeer.ListenerExists(AutomationEvents.InvokePatternOnInvoked))
{
peer.RaiseAutomationInvokeEvents(DataGridEditingUnit.Cell, e.Column, e.Row);
}
}
/// <summary>
/// Raises the RowEditEnded event.
/// </summary>
protected virtual void OnRowEditEnded(DataGridRowEditEndedEventArgs e)
{
EventHandler<DataGridRowEditEndedEventArgs> handler = this.RowEditEnded;
if (handler != null)
{
handler(this, e);
}
// Raise the automation invoke event for the row that just ended edit because the edits
// to its associated item have either been committed or reverted
DataGridAutomationPeer peer = DataGridAutomationPeer.FromElement(this) as DataGridAutomationPeer;
if (peer != null && AutomationPeer.ListenerExists(AutomationEvents.InvokePatternOnInvoked))
{
peer.RaiseAutomationInvokeEvents(DataGridEditingUnit.Row, null, e.Row);
}
}
/// <summary>
/// Raises the RowEditEnding event.
/// </summary>
protected virtual void OnRowEditEnding(DataGridRowEditEndingEventArgs e)
{
EventHandler<DataGridRowEditEndingEventArgs> handler = this.RowEditEnding;
if (handler != null)
{
handler(this, e);
}
}
/// <summary>
/// Raises the SelectionChanged event and clears the _selectionChanged flag.
/// The event won't be raised again until _selectionChanged is set back to true.
/// </summary>
protected virtual void OnSelectionChanged(SelectionChangedEventArgs e)
{
SelectionChangedEventHandler handler = this.SelectionChanged;
if (handler != null)
{
handler(this, e);
}
if (AutomationPeer.ListenerExists(AutomationEvents.SelectionItemPatternOnElementSelected) ||
AutomationPeer.ListenerExists(AutomationEvents.SelectionItemPatternOnElementAddedToSelection) ||
AutomationPeer.ListenerExists(AutomationEvents.SelectionItemPatternOnElementRemovedFromSelection))
{
DataGridAutomationPeer peer = DataGridAutomationPeer.FromElement(this) as DataGridAutomationPeer;
if (peer != null)
{
peer.RaiseAutomationSelectionEvents(e);
}
}
}
/// <summary>
/// Raises the UnloadingRow event for row recycling.
/// </summary>
protected virtual void OnUnloadingRow(DataGridRowEventArgs e)
{
EventHandler<DataGridRowEventArgs> handler = this.UnloadingRow;
if (handler != null)
{
this.LoadingOrUnloadingRow = true;
try
{
handler(this, e);
}
finally
{
this.LoadingOrUnloadingRow = false;
}
}
}
/// <summary>
/// Raises the UnloadingRowDetails event
/// </summary>
protected virtual void OnUnloadingRowDetails(DataGridRowDetailsEventArgs e)
{
EventHandler<DataGridRowDetailsEventArgs> handler = this.UnloadingRowDetails;
if (handler != null)
{
this.LoadingOrUnloadingRow = true;
try
{
handler(this, e);
}
finally
{
this.LoadingOrUnloadingRow = false;
}
}
}
/// <summary>
/// Raises the UnloadingRowGroup event
/// </summary>
/// <param name="e">EventArgs</param>
protected virtual void OnUnloadingRowGroup(DataGridRowGroupHeaderEventArgs e)
{
EventHandler<DataGridRowGroupHeaderEventArgs> handler = this.UnloadingRowGroup;
if (handler != null)
{
this.LoadingOrUnloadingRow = true;
try
{
handler(this, e);
}
finally
{
this.LoadingOrUnloadingRow = false;
}
}
}
internal static DataGridCell GetOwningCell(FrameworkElement element)
{
DiagnosticsDebug.Assert(element != null, "Expected non-null element.");
DataGridCell cell = element as DataGridCell;
while (element != null && cell == null)
{
element = element.Parent as FrameworkElement;
cell = element as DataGridCell;
}
return cell;
}
/// <summary>
/// Cancels editing mode for the specified DataGridEditingUnit and restores its original value.
/// </summary>
/// <param name="editingUnit">Specifies whether to cancel edit for a Cell or Row.</param>
/// <param name="raiseEvents">Specifies whether or not to raise editing events</param>
/// <returns>True if operation was successful. False otherwise.</returns>
internal bool CancelEdit(DataGridEditingUnit editingUnit, bool raiseEvents)
{
if (!EndCellEdit(DataGridEditAction.Cancel, true, this.ContainsFocus /*keepFocus*/, raiseEvents))
{
return false;
}
if (editingUnit == DataGridEditingUnit.Row)
{
return EndRowEdit(DataGridEditAction.Cancel, true, raiseEvents);
}
return true;
}
/// <summary>
/// Called when the selection changes or the SelectedItems object changes.
/// </summary>
internal void CoerceSelectedItem()
{
object selectedItem = null;
if (this.SelectionMode == DataGridSelectionMode.Extended &&
this.CurrentSlot != -1 &&
_selectedItems.ContainsSlot(this.CurrentSlot))
{
selectedItem = this.CurrentItem;
}
else if (_selectedItems.Count > 0)
{
selectedItem = _selectedItems[0];
}
if (this.SelectedItem != selectedItem)
{
this.SetValueNoCallback(DataGrid.SelectedItemProperty, selectedItem);
}
// Update the SelectedIndex
int newIndex = -1;
if (selectedItem != null)
{
newIndex = this.DataConnection.IndexOf(selectedItem);
}
if (this.SelectedIndex != newIndex)
{
this.SetValueNoCallback(DataGrid.SelectedIndexProperty, newIndex);
}
}
internal IEnumerable<object> GetSelectionInclusive(int startRowIndex, int endRowIndex)
{
int endSlot = SlotFromRowIndex(endRowIndex);
foreach (int slot in _selectedItems.GetSlots(SlotFromRowIndex(startRowIndex)))
{
if (slot > endSlot)
{
break;
}
yield return this.DataConnection.GetDataItem(RowIndexFromSlot(slot));
}
}
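// Illustrative sketch (hypothetical caller): GetSelectionInclusive is a lazy
// iterator, so callers that intend to modify the selection while enumerating
// should materialize the result first:
//
//     List<object> items = GetSelectionInclusive(firstRowIndex, lastRowIndex).ToList();
//
// The iterator yields the data items of the selected slots in order and stops
// as soon as it walks past the slot for endRowIndex.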
internal void InitializeElements(bool recycleRows)
{
try
{
_noCurrentCellChangeCount++;
// The underlying collection has changed and our editing row (if there is one)
// is no longer relevant, so we should force a cancel edit.
CancelEdit(DataGridEditingUnit.Row, false /*raiseEvents*/);
// We want to persist selection throughout a reset, so store away the selected items
List<object> selectedItemsCache = new List<object>(_selectedItems.SelectedItemsCache);
if (recycleRows)
{
RefreshRows(recycleRows /*recycleRows*/, true /*clearRows*/);
}
else
{
RefreshRowsAndColumns(true /*clearRows*/);
}
// Re-select the old items
_selectedItems.SelectedItemsCache = selectedItemsCache;
CoerceSelectedItem();
if (this.RowDetailsVisibilityMode != DataGridRowDetailsVisibilityMode.Collapsed)
{
UpdateRowDetailsVisibilityMode(this.RowDetailsVisibilityMode);
}
// The currently displayed rows may have incorrect visual states because of the selection change
ApplyDisplayedRowsState(this.DisplayData.FirstScrollingSlot, this.DisplayData.LastScrollingSlot);
}
finally
{
this.NoCurrentCellChangeCount--;
}
}
// Returns the item or the CollectionViewGroup that is used as the DataContext for a given slot.
// If the DataContext is an item, rowIndex is set to the index of the item within the collection.
internal object ItemFromSlot(int slot, ref int rowIndex)
{
if (this.RowGroupHeadersTable.Contains(slot))
{
DataGridRowGroupInfo groupInfo = this.RowGroupHeadersTable.GetValueAt(slot);
if (groupInfo != null)
{
return groupInfo.CollectionViewGroup;
}
}
else
{
rowIndex = RowIndexFromSlot(slot);
return this.DataConnection.GetDataItem(rowIndex);
}
return null;
}
internal void LoadMoreDataFromIncrementalItemsSource()
{
LoadMoreDataFromIncrementalItemsSource(totalVisibleHeight: EdgedRowsHeightCalculated);
}
internal void OnRowDetailsChanged()
{
if (!_scrollingByHeight)
{
// Update layout when RowDetails are expanded or collapsed, just updating the vertical scroll bar is not enough
// since rows could be added or removed.
InvalidateMeasure();
}
}
internal void OnUserSorting()
{
_isUserSorting = true;
}
internal void OnUserSorted()
{
_isUserSorting = false;
}
internal bool ProcessDownKey()
{
bool shift, ctrl;
KeyboardHelper.GetMetaKeyState(out ctrl, out shift);
return ProcessDownKeyInternal(shift, ctrl);
}
internal bool ProcessEndKey()
{
bool ctrl;
bool shift;
KeyboardHelper.GetMetaKeyState(out ctrl, out shift);
return this.ProcessEndKey(shift, ctrl);
}
internal bool ProcessEnterKey()
{
bool ctrl, shift;
KeyboardHelper.GetMetaKeyState(out ctrl, out shift);
return this.ProcessEnterKey(shift, ctrl);
}
internal bool ProcessHomeKey()
{
bool ctrl;
bool shift;
KeyboardHelper.GetMetaKeyState(out ctrl, out shift);
return this.ProcessHomeKey(shift, ctrl);
}
internal void ProcessHorizontalScroll(ScrollEventType scrollEventType)
{
if (scrollEventType == ScrollEventType.EndScroll)
{
this.IsHorizontalScrollBarInteracting = false;
}
else if (scrollEventType == ScrollEventType.ThumbTrack)
{
this.IsHorizontalScrollBarInteracting = true;
}
if (_horizontalScrollChangesIgnored > 0)
{
return;
}
// If the user scrolls with the buttons, we need to update the new value of the scroll bar since we delay
// this calculation. If they scroll in any other way, the scroll bar's correct value has already been set.
double scrollBarValueDifference = 0;
if (scrollEventType == ScrollEventType.SmallIncrement)
{
scrollBarValueDifference = GetHorizontalSmallScrollIncrease();
}
else if (scrollEventType == ScrollEventType.SmallDecrement)
{
scrollBarValueDifference = -GetHorizontalSmallScrollDecrease();
}
_horizontalScrollChangesIgnored++;
try
{
if (scrollBarValueDifference != 0)
{
DiagnosticsDebug.Assert(_horizontalOffset + scrollBarValueDifference >= 0, "Expected positive _horizontalOffset + scrollBarValueDifference.");
SetHorizontalOffset(_horizontalOffset + scrollBarValueDifference);
}
UpdateHorizontalOffset(_hScrollBar.Value);
}
finally
{
_horizontalScrollChangesIgnored--;
}
DataGridAutomationPeer peer = DataGridAutomationPeer.FromElement(this) as DataGridAutomationPeer;
if (peer != null)
{
peer.RaiseAutomationScrollEvents();
}
}
internal bool ProcessLeftKey()
{
bool ctrl;
bool shift;
KeyboardHelper.GetMetaKeyState(out ctrl, out shift);
return this.ProcessLeftKey(shift, ctrl);
}
internal bool ProcessNextKey()
{
bool ctrl;
bool shift;
KeyboardHelper.GetMetaKeyState(out ctrl, out shift);
return this.ProcessNextKey(shift, ctrl);
}
internal bool ProcessPriorKey()
{
bool ctrl;
bool shift;
KeyboardHelper.GetMetaKeyState(out ctrl, out shift);
return this.ProcessPriorKey(shift, ctrl);
}
internal bool ProcessRightKey()
{
bool ctrl;
bool shift;
KeyboardHelper.GetMetaKeyState(out ctrl, out shift);
return this.ProcessRightKey(shift, ctrl);
}
internal bool ProcessScrollOffsetDelta(double offsetDelta, bool isForHorizontalScroll)
{
if (this.IsEnabled && this.DisplayData.NumDisplayedScrollingElements > 0)
{
if (isForHorizontalScroll)
{
double newHorizontalOffset = this.HorizontalOffset + offsetDelta;
if (newHorizontalOffset < 0)
{
newHorizontalOffset = 0;
}
double maxHorizontalOffset = Math.Max(0, this.ColumnsInternal.VisibleEdgedColumnsWidth - this.CellsWidth);
if (newHorizontalOffset > maxHorizontalOffset)
{
newHorizontalOffset = maxHorizontalOffset;
}
if (newHorizontalOffset != this.HorizontalOffset)
{
UpdateHorizontalOffset(newHorizontalOffset);
return true;
}
}
else
{
if (offsetDelta < 0)
{
offsetDelta = Math.Max(-_verticalOffset, offsetDelta);
}
else if (offsetDelta > 0)
{
if (_vScrollBar != null && this.VerticalScrollBarVisibility == ScrollBarVisibility.Visible)
{
offsetDelta = Math.Min(Math.Max(0, _vScrollBar.Maximum - _verticalOffset), offsetDelta);
}
else
{
double maximum = this.EdgedRowsHeightCalculated - this.CellsHeight;
offsetDelta = Math.Min(Math.Max(0, maximum - _verticalOffset), offsetDelta);
}
}
if (offsetDelta != 0)
{
this.DisplayData.PendingVerticalScrollHeight = offsetDelta;
InvalidateRowsMeasure(false /*invalidateIndividualElements*/);
return true;
}
}
}
return false;
}
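// Illustrative sketch: both branches above clamp the new offset to the scrollable
// range before applying it, which is equivalent to
//
//     newOffset = Math.Min(Math.Max(0, oldOffset + offsetDelta), maxOffset);
//
// where maxOffset is VisibleEdgedColumnsWidth - CellsWidth for horizontal scrolls,
// and EdgedRowsHeightCalculated - CellsHeight (or _vScrollBar.Maximum when the
// vertical scroll bar is visible) for vertical scrolls.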
/// <summary>
/// Selects items and updates currency based on parameters.
/// </summary>
/// <param name="columnIndex">Column index to make current.</param>
/// <param name="item">Data item or CollectionViewGroup to make current.</param>
/// <param name="backupSlot">Slot to use in case the item is no longer valid.</param>
/// <param name="action">Selection action to perform.</param>
/// <param name="scrollIntoView">Whether or not the new current item should be scrolled into view.</param>
internal void ProcessSelectionAndCurrency(int columnIndex, object item, int backupSlot, DataGridSelectionAction action, bool scrollIntoView)
{
_noSelectionChangeCount++;
_noCurrentCellChangeCount++;
try
{
int slot = -1;
ICollectionViewGroup group = item as ICollectionViewGroup;
if (group != null)
{
DataGridRowGroupInfo groupInfo = this.RowGroupInfoFromCollectionViewGroup(group);
if (groupInfo != null)
{
slot = groupInfo.Slot;
}
}
else
{
slot = this.SlotFromRowIndex(this.DataConnection.IndexOf(item));
}
if (slot == -1)
{
slot = backupSlot;
}
if (slot < 0 || slot > this.SlotCount)
{
return;
}
switch (action)
{
case DataGridSelectionAction.AddCurrentToSelection:
SetRowSelection(slot, true /*isSelected*/, true /*setAnchorSlot*/);
break;
case DataGridSelectionAction.RemoveCurrentFromSelection:
SetRowSelection(slot, false /*isSelected*/, false /*setAnchorSlot*/);
break;
case DataGridSelectionAction.SelectFromAnchorToCurrent:
if (this.SelectionMode == DataGridSelectionMode.Extended && this.AnchorSlot != -1)
{
int anchorSlot = this.AnchorSlot;
ClearRowSelection(slot /*slotException*/, false /*setAnchorSlot*/);
if (slot <= anchorSlot)
{
SetRowsSelection(slot, anchorSlot);
}
else
{
SetRowsSelection(anchorSlot, slot);
}
}
else
{
goto case DataGridSelectionAction.SelectCurrent;
}
break;
case DataGridSelectionAction.SelectCurrent:
ClearRowSelection(slot /*slotException*/, true /*setAnchorSlot*/);
break;
case DataGridSelectionAction.None:
break;
}
if (this.CurrentSlot != slot || (this.CurrentColumnIndex != columnIndex && columnIndex != -1))
{
if (columnIndex == -1)
{
if (this.CurrentColumnIndex != -1)
{
columnIndex = this.CurrentColumnIndex;
}
else
{
DataGridColumn firstVisibleColumn = this.ColumnsInternal.FirstVisibleNonFillerColumn;
if (firstVisibleColumn != null)
{
columnIndex = firstVisibleColumn.Index;
}
}
}
if (columnIndex != -1)
{
if (!SetCurrentCellCore(columnIndex, slot, true /*commitEdit*/, SlotFromRowIndex(this.SelectedIndex) != slot /*endRowEdit*/)
|| (scrollIntoView && !ScrollSlotIntoView(columnIndex, slot, true /*forCurrentCellChange*/, false /*forceHorizontalScroll*/)))
{
return;
}
}
}
_successfullyUpdatedSelection = true;
}
finally
{
this.NoCurrentCellChangeCount--;
this.NoSelectionChangeCount--;
}
}
internal bool ProcessUpKey()
{
bool ctrl;
bool shift;
KeyboardHelper.GetMetaKeyState(out ctrl, out shift);
return this.ProcessUpKey(shift, ctrl);
}
internal void ProcessVerticalScroll(ScrollEventType scrollEventType)
{
if (scrollEventType == ScrollEventType.EndScroll)
{
this.IsVerticalScrollBarInteracting = false;
}
else if (scrollEventType == ScrollEventType.ThumbTrack)
{
this.IsVerticalScrollBarInteracting = true;
}
if (_verticalScrollChangesIgnored > 0)
{
return;
}
DiagnosticsDebug.Assert(_vScrollBar != null, "Expected non-null _vScrollBar.");
DiagnosticsDebug.Assert(DoubleUtil.LessThanOrClose(_vScrollBar.Value, _vScrollBar.Maximum), "Expected _vScrollBar.Value smaller than or close to _vScrollBar.Maximum.");
_verticalScrollChangesIgnored++;
try
{
if (scrollEventType == ScrollEventType.SmallIncrement)
{
this.DisplayData.PendingVerticalScrollHeight = GetVerticalSmallScrollIncrease();
double newVerticalOffset = _verticalOffset + this.DisplayData.PendingVerticalScrollHeight;
if (newVerticalOffset > _vScrollBar.Maximum)
{
this.DisplayData.PendingVerticalScrollHeight -= newVerticalOffset - _vScrollBar.Maximum;
}
}
else if (scrollEventType == ScrollEventType.SmallDecrement)
{
if (DoubleUtil.GreaterThan(this.NegVerticalOffset, 0))
{
this.DisplayData.PendingVerticalScrollHeight -= this.NegVerticalOffset;
}
else
{
int previousScrollingSlot = this.GetPreviousVisibleSlot(this.DisplayData.FirstScrollingSlot);
if (previousScrollingSlot >= 0)
{
ScrollSlotIntoView(previousScrollingSlot, false /*scrolledHorizontally*/);
}
return;
}
}
else
{
this.DisplayData.PendingVerticalScrollHeight = _vScrollBar.Value - _verticalOffset;
}
if (!DoubleUtil.IsZero(this.DisplayData.PendingVerticalScrollHeight))
{
// Invalidate so the scroll happens on idle
InvalidateRowsMeasure(false /*invalidateIndividualElements*/);
}
}
finally
{
_verticalScrollChangesIgnored--;
}
}
internal void RefreshRowsAndColumns(bool clearRows)
{
if (_measured)
{
try
{
_noCurrentCellChangeCount++;
if (clearRows)
{
ClearRows(false /*recycle*/);
ClearRowGroupHeadersTable();
PopulateRowGroupHeadersTable();
RefreshSlotCounts();
}
if (this.AutoGenerateColumns)
{
// Column auto-generation refreshes the rows too
AutoGenerateColumnsPrivate();
}
foreach (DataGridColumn column in this.ColumnsItemsInternal)
{
// We don't need to refresh the state of AutoGenerated column headers because they're up-to-date
if (!column.IsAutoGenerated && column.HasHeaderCell)
{
column.HeaderCell.ApplyState(false);
}
}
RefreshRows(false /*recycleRows*/, false /*clearRows*/);
if (this.Columns.Count > 0 && this.CurrentColumnIndex == -1)
{
MakeFirstDisplayedCellCurrentCell();
}
else
{
_makeFirstDisplayedCellCurrentCellPending = false;
_desiredCurrentColumnIndex = -1;
FlushCurrentCellChanged();
}
}
finally
{
this.NoCurrentCellChangeCount--;
}
}
else
{
if (clearRows)
{
ClearRows(false /*recycle*/);
}
ClearRowGroupHeadersTable();
PopulateRowGroupHeadersTable();
RefreshSlotCounts();
}
}
internal void ResetColumnHeaderInteractionInfo()
{
DataGridColumnHeaderInteractionInfo interactionInfo = this.ColumnHeaderInteractionInfo;
if (interactionInfo != null)
{
interactionInfo.CapturedPointer = null;
interactionInfo.DragMode = DataGridColumnHeader.DragMode.None;
interactionInfo.DragPointerId = 0;
interactionInfo.DragColumn = null;
interactionInfo.DragStart = null;
interactionInfo.PressedPointerPositionHeaders = null;
interactionInfo.LastPointerPositionHeaders = null;
}
if (this.ColumnHeaders != null)
{
this.ColumnHeaders.DragColumn = null;
this.ColumnHeaders.DragIndicator = null;
this.ColumnHeaders.DropLocationIndicator = null;
}
}
internal bool ScrollSlotIntoView(int columnIndex, int slot, bool forCurrentCellChange, bool forceHorizontalScroll)
{
DiagnosticsDebug.Assert(columnIndex >= 0, "Expected positive columnIndex.");
DiagnosticsDebug.Assert(columnIndex < this.ColumnsItemsInternal.Count, "Expected columnIndex smaller than ColumnsItemsInternal.Count.");
DiagnosticsDebug.Assert(this.DisplayData.FirstDisplayedScrollingCol >= -1, "Expected DisplayData.FirstDisplayedScrollingCol greater than or equal to -1.");
DiagnosticsDebug.Assert(this.DisplayData.FirstDisplayedScrollingCol < this.ColumnsItemsInternal.Count, "Expected DisplayData.FirstDisplayedScrollingCol smaller than ColumnsItemsInternal.Count.");
DiagnosticsDebug.Assert(this.DisplayData.LastTotallyDisplayedScrollingCol >= -1, "Expected DisplayData.LastTotallyDisplayedScrollingCol greater than or equal to -1.");
DiagnosticsDebug.Assert(this.DisplayData.LastTotallyDisplayedScrollingCol < this.ColumnsItemsInternal.Count, "Expected DisplayData.LastTotallyDisplayedScrollingCol smaller than ColumnsItemsInternal.Count.");
DiagnosticsDebug.Assert(!IsSlotOutOfBounds(slot), "Expected IsSlotOutOfBounds(slot) is false.");
DiagnosticsDebug.Assert(this.DisplayData.FirstScrollingSlot >= -1, "Expected DisplayData.FirstScrollingSlot greater than or equal to -1.");
DiagnosticsDebug.Assert(this.DisplayData.FirstScrollingSlot < this.SlotCount, "Expected DisplayData.FirstScrollingSlot smaller than SlotCount.");
DiagnosticsDebug.Assert(this.ColumnsItemsInternal[columnIndex].IsVisible, "Expected ColumnsItemsInternal[columnIndex].IsVisible is true.");
if (this.CurrentColumnIndex >= 0 &&
(this.CurrentColumnIndex != columnIndex || this.CurrentSlot != slot))
{
if (!CommitEditForOperation(columnIndex, slot, forCurrentCellChange) || IsInnerCellOutOfBounds(columnIndex, slot))
{
return false;
}
}
double oldHorizontalOffset = this.HorizontalOffset;
bool rowGroupHeadersTableContainsSlot = this.RowGroupHeadersTable.Contains(slot);
// scroll horizontally unless we're on a RowGroupHeader and we're not forcing horizontal scrolling
if ((forceHorizontalScroll || (slot != -1 && !rowGroupHeadersTableContainsSlot)) &&
!ScrollColumnIntoView(columnIndex))
{
return false;
}
// scroll vertically
if (!ScrollSlotIntoView(slot, oldHorizontalOffset != this.HorizontalOffset /*scrolledHorizontally*/))
{
return false;
}
// Scrolling horizontally or vertically could cause fewer rows to be displayed
this.DisplayData.FullyRecycleElements();
DataGridAutomationPeer peer = DataGridAutomationPeer.FromElement(this) as DataGridAutomationPeer;
if (peer != null)
{
peer.RaiseAutomationScrollEvents();
}
return true;
}
// Convenient overload that commits the current edit.
internal bool SetCurrentCellCore(int columnIndex, int slot)
{
return SetCurrentCellCore(columnIndex, slot, true /*commitEdit*/, true /*endRowEdit*/);
}
internal void UpdateHorizontalOffset(double newValue)
{
if (this.HorizontalOffset != newValue)
{
this.HorizontalOffset = newValue;
InvalidateColumnHeadersMeasure();
InvalidateRowsMeasure(true);
}
}
internal bool UpdateSelectionAndCurrency(int columnIndex, int slot, DataGridSelectionAction action, bool scrollIntoView)
{
_successfullyUpdatedSelection = false;
_noSelectionChangeCount++;
_noCurrentCellChangeCount++;
try
{
if (this.ColumnsInternal.RowGroupSpacerColumn.IsRepresented &&
columnIndex == this.ColumnsInternal.RowGroupSpacerColumn.Index)
{
columnIndex = -1;
}
if (IsSlotOutOfSelectionBounds(slot) || (columnIndex != -1 && IsColumnOutOfBounds(columnIndex)))
{
return false;
}
int newCurrentPosition = -1;
object item = ItemFromSlot(slot, ref newCurrentPosition);
if (newCurrentPosition == this.DataConnection.NewItemPlaceholderIndex)
{
newCurrentPosition = -1;
}
if (this.EditingRow != null && slot != this.EditingRow.Slot && !CommitEdit(DataGridEditingUnit.Row, true))
{
return false;
}
if (this.DataConnection.CollectionView != null &&
this.DataConnection.CollectionView.CurrentPosition != newCurrentPosition)
{
this.DataConnection.MoveCurrentTo(item, slot, columnIndex, action, scrollIntoView);
}
else
{
this.ProcessSelectionAndCurrency(columnIndex, item, slot, action, scrollIntoView);
}
}
finally
{
this.NoCurrentCellChangeCount--;
this.NoSelectionChangeCount--;
}
return _successfullyUpdatedSelection;
}
internal void UpdateStateOnCurrentChanged(object currentItem, int currentPosition)
{
if ((currentItem == this.CurrentItem) &&
(_isUserSorting || (currentItem == this.SelectedItem && currentPosition == this.SelectedIndex)))
{
// The DataGrid's CurrentItem is already up-to-date, so we don't need to do anything.
// In the sorting case, we receive a CurrentChanged notification if the current item
// changes position in the CollectionView. However, our CurrentItem is already
// in the correct position in this case, and we do not want to update the selection so
// we no-op here.
return;
}
int columnIndex = this.CurrentColumnIndex;
if (columnIndex == -1)
{
if (this.IsColumnOutOfBounds(_desiredCurrentColumnIndex) ||
    (this.ColumnsInternal.RowGroupSpacerColumn.IsRepresented && _desiredCurrentColumnIndex == this.ColumnsInternal.RowGroupSpacerColumn.Index))
{
    columnIndex = this.FirstDisplayedNonFillerColumnIndex;
}
else
{
    columnIndex = _desiredCurrentColumnIndex;
}
}
// The CollectionView will potentially raise multiple CurrentChanged events during a single
// add operation, so we should avoid resetting our desired column index until it's committed.
if (!this.DataConnection.IsAddingNew)
{
_desiredCurrentColumnIndex = -1;
}
try
{
_noSelectionChangeCount++;
_noCurrentCellChangeCount++;
if (!this.CommitEdit())
{
this.CancelEdit(DataGridEditingUnit.Row, false /*raiseEvents*/);
}
this.ClearRowSelection(true);
if (currentItem == null)
{
SetCurrentCellCore(-1, -1);
}
else
{
int slot = SlotFromRowIndex(currentPosition);
this.ProcessSelectionAndCurrency(columnIndex, currentItem, slot, DataGridSelectionAction.SelectCurrent, false);
}
}
finally
{
this.NoCurrentCellChangeCount--;
this.NoSelectionChangeCount--;
}
}
internal bool UpdateStateOnTapped(TappedRoutedEventArgs args, int columnIndex, int slot, bool allowEdit)
{
bool ctrl, shift;
KeyboardHelper.GetMetaKeyState(out ctrl, out shift);
return this.UpdateStateOnTapped(args, columnIndex, slot, allowEdit, shift, ctrl);
}
internal void UpdateVerticalScrollBar()
{
if (_vScrollBar != null && _vScrollBar.Visibility == Visibility.Visible)
{
double cellsHeight = this.CellsHeight;
double edgedRowsHeightCalculated = this.EdgedRowsHeightCalculated;
UpdateVerticalScrollBar(
edgedRowsHeightCalculated > cellsHeight /*needVertScrollBar*/,
this.VerticalScrollBarVisibility == ScrollBarVisibility.Visible /*forceVertScrollBar*/,
edgedRowsHeightCalculated,
cellsHeight);
}
}
/// <summary>
/// If the editing element has focus, this method will set focus to the DataGrid itself
/// in order to force the element to lose focus. It will then wait for the editing element's
/// LostFocus event, at which point it will perform the specified action.
/// NOTE: It is important to understand that the specified action will be performed when the editing
/// element loses focus only if this method returns true. If it returns false, then the action
/// will not be performed later on, and should instead be performed by the caller, if necessary.
/// </summary>
/// <param name="action">Action to perform after the editing element loses focus</param>
/// <returns>True if the editing element had focus and the action was cached away; false otherwise</returns>
internal bool WaitForLostFocus(Action action)
{
if (this.EditingRow != null && this.EditingColumnIndex != -1 && !_executingLostFocusActions)
{
DataGridColumn editingColumn = this.ColumnsItemsInternal[this.EditingColumnIndex];
FrameworkElement editingElement = editingColumn.GetCellContent(this.EditingRow);
if (editingElement != null && editingElement.ContainsChild(_focusedObject))
{
DiagnosticsDebug.Assert(_lostFocusActions != null, "Expected non-null _lostFocusActions.");
_lostFocusActions.Enqueue(action);
editingElement.LostFocus += new RoutedEventHandler(EditingElement_LostFocus);
this.IsTabStop = true;
this.Focus(FocusState.Programmatic);
return true;
}
}
return false;
}
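// Illustrative sketch (hypothetical caller): per the summary above, callers defer
// an operation until the editing element has actually lost focus, and fall back
// to running it immediately when WaitForLostFocus returns false:
//
//     if (!WaitForLostFocus(() => ProcessDownKeyInternal(shift, ctrl)))
//     {
//         // The editing element did not have focus; perform the action right away.
//         ProcessDownKeyInternal(shift, ctrl);
//     }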
// Applies the new DataGrid style to the element if it was using the old DataGrid style (or no style at all)
private static void EnsureElementStyle(FrameworkElement element, Style oldDataGridStyle, Style newDataGridStyle)
{
DiagnosticsDebug.Assert(element != null, "Expected non-null element.");
// Apply the DataGrid style if the row was using the old DataGridRowStyle before
if (element != null && (element.Style == null || element.Style == oldDataGridStyle))
{
element.SetStyleWithType(newDataGridStyle);
}
}
private bool AddNewItem(RoutedEventArgs editingEventArgs)
{
#if FEATURE_IEDITABLECOLLECTIONVIEW
if (this.DataConnection.EditableCollectionView != null && this.DataConnection.EditableCollectionView.CanAddNew)
{
_desiredCurrentColumnIndex = this.CurrentColumnIndex;
object addItem = this.DataConnection.EditableCollectionView.AddNew();
if (this.CurrentItem != this.DataConnection.EditableCollectionView.CurrentAddItem)
{
int newItemSlot = SlotFromRowIndex(this.DataConnection.IndexOf(addItem));
SetAndSelectCurrentCell(this.CurrentColumnIndex, newItemSlot, true);
if (!_successfullyUpdatedSelection)
{
return false;
}
}
return BeginCellEdit(editingEventArgs);
}
#endif
return false;
}
private void AddNewCellPrivate(DataGridRow row, DataGridColumn column)
{
DataGridCell newCell = new DataGridCell();
PopulateCellContent(false /*isCellEdited*/, column, row, newCell);
if (row.OwningGrid != null)
{
newCell.OwningColumn = column;
newCell.Visibility = column.Visibility;
}
if (column is DataGridFillerColumn)
{
Windows.UI.Xaml.Automation.AutomationProperties.SetAccessibilityView(
newCell,
AccessibilityView.Raw);
}
newCell.EnsureStyle(null);
row.Cells.Insert(column.Index, newCell);
}
// TODO: Call this method once the UISettings has a public property for the "Automatically hide scroll bars in Windows" setting
// private void AutoHideScrollBarsChanged()
// {
// if (UISettingsHelper.AreSettingsAutoHidingScrollBars)
// {
// SwitchScrollBarsVisualStates(_proposedScrollBarsState, _proposedScrollBarsSeparatorState, true /*useTransitions*/);
// }
// else
// {
// if (this.AreBothScrollBarsVisible)
// {
// if (UISettingsHelper.AreSettingsEnablingAnimations)
// {
// SwitchScrollBarsVisualStates(ScrollBarVisualState.MouseIndicatorFull, this.IsEnabled ? ScrollBarsSeparatorVisualState.SeparatorExpanded : ScrollBarsSeparatorVisualState.SeparatorCollapsed, true /*useTransitions*/);
// }
// else
// {
// SwitchScrollBarsVisualStates(ScrollBarVisualState.MouseIndicatorFull, this.IsEnabled ? ScrollBarsSeparatorVisualState.SeparatorExpandedWithoutAnimation : ScrollBarsSeparatorVisualState.SeparatorCollapsed, true /*useTransitions*/);
// }
// }
// else
// {
// if (UISettingsHelper.AreSettingsEnablingAnimations)
// {
// SwitchScrollBarsVisualStates(ScrollBarVisualState.MouseIndicator, ScrollBarsSeparatorVisualState.SeparatorCollapsed, true/*useTransitions*/);
// }
// else
// {
// SwitchScrollBarsVisualStates(ScrollBarVisualState.MouseIndicator, this.IsEnabled ? ScrollBarsSeparatorVisualState.SeparatorCollapsedWithoutAnimation : ScrollBarsSeparatorVisualState.SeparatorCollapsed, true /*useTransitions*/);
// }
// }
// }
// }
private bool BeginCellEdit(RoutedEventArgs editingEventArgs)
{
if (this.CurrentColumnIndex == -1 || !GetRowSelection(this.CurrentSlot))
{
return false;
}
DiagnosticsDebug.Assert(this.CurrentColumnIndex >= 0, "Expected positive CurrentColumnIndex.");
DiagnosticsDebug.Assert(this.CurrentColumnIndex < this.ColumnsItemsInternal.Count, "Expected CurrentColumnIndex smaller than ColumnsItemsInternal.Count.");
DiagnosticsDebug.Assert(this.CurrentSlot >= -1, "Expected CurrentSlot greater than or equal to -1.");
DiagnosticsDebug.Assert(this.CurrentSlot < this.SlotCount, "Expected CurrentSlot smaller than SlotCount.");
DiagnosticsDebug.Assert(this.EditingRow == null || this.EditingRow.Slot == this.CurrentSlot, "Expected null EditingRow or EditingRow.Slot equal to CurrentSlot.");
DiagnosticsDebug.Assert(!GetColumnEffectiveReadOnlyState(this.CurrentColumn), "Expected GetColumnEffectiveReadOnlyState(CurrentColumn) is false.");
DiagnosticsDebug.Assert(this.CurrentColumn.IsVisible, "Expected CurrentColumn.IsVisible is true.");
if (_editingColumnIndex != -1)
{
// Current cell is already in edit mode
DiagnosticsDebug.Assert(_editingColumnIndex == this.CurrentColumnIndex, "Expected _editingColumnIndex equals CurrentColumnIndex.");
return true;
}
// When we begin edit on the NewItemPlaceHolder row, we should try to add a new item.
if (this.CurrentSlot == SlotFromRowIndex(this.DataConnection.NewItemPlaceholderIndex))
{
return this.AddNewItem(editingEventArgs);
}
// Get or generate the editing row if it doesn't exist
DataGridRow dataGridRow = this.EditingRow;
if (dataGridRow == null)
{
DiagnosticsDebug.Assert(!this.RowGroupHeadersTable.Contains(this.CurrentSlot), "Expected CurrentSlot not contained in RowGroupHeadersTable.");
if (this.IsSlotVisible(this.CurrentSlot))
{
dataGridRow = this.DisplayData.GetDisplayedElement(this.CurrentSlot) as DataGridRow;
DiagnosticsDebug.Assert(dataGridRow != null, "Expected non-null dataGridRow.");
}
else
{
dataGridRow = GenerateRow(RowIndexFromSlot(this.CurrentSlot), this.CurrentSlot);
dataGridRow.Clip = new RectangleGeometry();
}
if (this.DataConnection.IsAddingNew)
{
// We just began editing the new item row, so set a flag that prevents us from running
// full entity validation until the user explicitly attempts to end editing the row.
_initializingNewItem = true;
}
}
DiagnosticsDebug.Assert(dataGridRow != null, "Expected non-null dataGridRow.");
// Cache these to see if they change later
int currentRowIndex = this.CurrentSlot;
int currentColumnIndex = this.CurrentColumnIndex;
// Raise the BeginningEdit event
DataGridCell dataGridCell = dataGridRow.Cells[this.CurrentColumnIndex];
DataGridBeginningEditEventArgs e = new DataGridBeginningEditEventArgs(this.CurrentColumn, dataGridRow, editingEventArgs);
OnBeginningEdit(e);
if (e.Cancel ||
currentRowIndex != this.CurrentSlot ||
currentColumnIndex != this.CurrentColumnIndex ||
!GetRowSelection(this.CurrentSlot) ||
(this.EditingRow == null && !BeginRowEdit(dataGridRow)))
{
// If either BeginningEdit was canceled, currency/selection was changed in the event handler,
// or we failed opening the row for edit, then we can no longer continue BeginCellEdit
return false;
}
if (this.EditingRow == null || this.EditingRow.Slot != this.CurrentSlot)
{
// This check was added to safeguard against a ListCollectionView bug where the collection changed currency
// during a CommitNew operation but failed to raise a CurrentChanged event.
return false;
}
// Finally, we can prepare the cell for editing
_editingColumnIndex = this.CurrentColumnIndex;
_editingEventArgs = editingEventArgs;
this.EditingRow.Cells[this.CurrentColumnIndex].ApplyCellState(true /*animate*/);
PopulateCellContent(true /*isCellEdited*/, this.CurrentColumn, dataGridRow, dataGridCell);
return true;
}
private bool BeginRowEdit(DataGridRow dataGridRow)
{
DiagnosticsDebug.Assert(this.EditingRow == null, "Expected null EditingRow.");
DiagnosticsDebug.Assert(dataGridRow != null, "Expected non-null dataGridRow.");
DiagnosticsDebug.Assert(this.CurrentSlot >= -1, "Expected CurrentSlot greater than or equal to -1.");
DiagnosticsDebug.Assert(this.CurrentSlot < this.SlotCount, "Expected CurrentSlot smaller than SlotCount.");
if (this.DataConnection.BeginEdit(dataGridRow.DataContext))
{
this.EditingRow = dataGridRow;
this.GenerateEditingElements();
this.ValidateEditingRow(false /*scrollIntoView*/, true /*wireEvents*/);
// Raise the automation invoke event for the row that just began edit
DataGridAutomationPeer peer = DataGridAutomationPeer.FromElement(this) as DataGridAutomationPeer;
if (peer != null && AutomationPeer.ListenerExists(AutomationEvents.InvokePatternOnInvoked))
{
peer.RaiseAutomationInvokeEvents(DataGridEditingUnit.Row, null, dataGridRow);
}
return true;
}
return false;
}
private bool CancelRowEdit(bool exitEditingMode)
{
if (this.EditingRow == null)
{
return true;
}
DiagnosticsDebug.Assert(this.EditingRow != null, "Expected non-null EditingRow.");
DiagnosticsDebug.Assert(this.EditingRow.Index >= -1, "Expected EditingRow.Index greater than or equal to -1.");
DiagnosticsDebug.Assert(this.EditingRow.Slot < this.SlotCount, "Expected EditingRow.Slot smaller than SlotCount.");
DiagnosticsDebug.Assert(this.CurrentColumn != null, "Expected non-null CurrentColumn.");
object dataItem = this.EditingRow.DataContext;
if (!this.DataConnection.CancelEdit(dataItem))
{
return false;
}
foreach (DataGridColumn column in this.Columns)
{
if (!exitEditingMode && column.Index == _editingColumnIndex && column is DataGridBoundColumn)
{
continue;
}
PopulateCellContent(!exitEditingMode && column.Index == _editingColumnIndex /*isCellEdited*/, column, this.EditingRow, this.EditingRow.Cells[column.Index]);
}
return true;
}
private bool CommitEditForOperation(int columnIndex, int slot, bool forCurrentCellChange)
{
if (forCurrentCellChange)
{
if (!EndCellEdit(DataGridEditAction.Commit, true /*exitEditingMode*/, true /*keepFocus*/, true /*raiseEvents*/))
{
return false;
}
if (this.CurrentSlot != slot &&
!EndRowEdit(DataGridEditAction.Commit, true /*exitEditingMode*/, true /*raiseEvents*/))
{
return false;
}
}
if (IsColumnOutOfBounds(columnIndex))
{
return false;
}
if (slot >= this.SlotCount)
{
// Current cell was reset because the commit deleted row(s).
// Since the user wants to change the current cell, we don't
// want to end up with no current cell. We pick the last row
// in the grid which may be the 'new row'.
int lastSlot = this.LastVisibleSlot;
if (forCurrentCellChange &&
this.CurrentColumnIndex == -1 &&
lastSlot != -1)
{
SetAndSelectCurrentCell(columnIndex, lastSlot, false /*forceCurrentCellSelection (unused here)*/);
}
// Interrupt operation because it has become invalid.
return false;
}
return true;
}
private bool CommitRowEdit(bool exitEditingMode)
{
if (this.EditingRow == null)
{
return true;
}
DiagnosticsDebug.Assert(this.EditingRow != null, "Expected non-null EditingRow.");
DiagnosticsDebug.Assert(this.EditingRow.Index >= -1, "Expected EditingRow.Index greater than or equal to -1.");
DiagnosticsDebug.Assert(this.EditingRow.Slot < this.SlotCount, "Expected EditingRow.Slot smaller than SlotCount.");
if (!ValidateEditingRow(true /*scrollIntoView*/, false /*wireEvents*/))
{
return false;
}
this.DataConnection.EndEdit(this.EditingRow.DataContext);
if (!exitEditingMode)
{
this.DataConnection.BeginEdit(this.EditingRow.DataContext);
}
return true;
}
private void CompleteCellsCollection(DataGridRow dataGridRow)
{
DiagnosticsDebug.Assert(dataGridRow != null, "Expected non-null dataGridRow.");
int cellsInCollection = dataGridRow.Cells.Count;
if (this.ColumnsItemsInternal.Count > cellsInCollection)
{
for (int columnIndex = cellsInCollection; columnIndex < this.ColumnsItemsInternal.Count; columnIndex++)
{
AddNewCellPrivate(dataGridRow, this.ColumnsItemsInternal[columnIndex]);
}
}
}
private void ComputeScrollBarsLayout()
{
if (_ignoreNextScrollBarsLayout)
{
_ignoreNextScrollBarsLayout = false;
// TODO: This optimization is causing problems with initial layout:
// Investigate why horizontal ScrollBar sometimes has incorrect thumb size when
// it first appears after adding a row when this perf improvement is turned on.
// return;
}
bool isHorizontalScrollBarOverCells = this.IsHorizontalScrollBarOverCells;
bool isVerticalScrollBarOverCells = this.IsVerticalScrollBarOverCells;
double cellsWidth = this.CellsWidth;
double cellsHeight = this.CellsHeight;
bool allowHorizScrollBar = false;
bool forceHorizScrollBar = false;
double horizScrollBarHeight = 0;
if (_hScrollBar != null)
{
forceHorizScrollBar = this.HorizontalScrollBarVisibility == ScrollBarVisibility.Visible;
allowHorizScrollBar = forceHorizScrollBar || (this.ColumnsInternal.VisibleColumnCount > 0 &&
this.HorizontalScrollBarVisibility != ScrollBarVisibility.Disabled &&
this.HorizontalScrollBarVisibility != ScrollBarVisibility.Hidden);
// Compensate if the horizontal scrollbar is already taking up space
if (!forceHorizScrollBar && _hScrollBar.Visibility == Visibility.Visible)
{
if (!isHorizontalScrollBarOverCells)
{
cellsHeight += _hScrollBar.DesiredSize.Height;
}
}
if (!isHorizontalScrollBarOverCells)
{
horizScrollBarHeight = _hScrollBar.Height + _hScrollBar.Margin.Top + _hScrollBar.Margin.Bottom;
}
}
bool allowVertScrollBar = false;
bool forceVertScrollBar = false;
double vertScrollBarWidth = 0;
if (_vScrollBar != null)
{
forceVertScrollBar = this.VerticalScrollBarVisibility == ScrollBarVisibility.Visible;
allowVertScrollBar = forceVertScrollBar || (this.ColumnsItemsInternal.Count > 0 &&
this.VerticalScrollBarVisibility != ScrollBarVisibility.Disabled &&
this.VerticalScrollBarVisibility != ScrollBarVisibility.Hidden);
// Compensate if the vertical scrollbar is already taking up space
if (!forceVertScrollBar && _vScrollBar.Visibility == Visibility.Visible)
{
if (!isVerticalScrollBarOverCells)
{
cellsWidth += _vScrollBar.DesiredSize.Width;
}
}
if (!isVerticalScrollBarOverCells)
{
vertScrollBarWidth = _vScrollBar.Width + _vScrollBar.Margin.Left + _vScrollBar.Margin.Right;
}
}
// At this point, cellsWidth and cellsHeight are the dimensions potentially available for displaying data cells.
bool needHorizScrollBar = false;
bool needVertScrollBar = false;
double totalVisibleWidth = this.ColumnsInternal.VisibleEdgedColumnsWidth;
double totalVisibleFrozenWidth = this.ColumnsInternal.GetVisibleFrozenEdgedColumnsWidth();
UpdateDisplayedRows(this.DisplayData.FirstScrollingSlot, this.CellsHeight);
double totalVisibleHeight = this.EdgedRowsHeightCalculated;
if (!forceHorizScrollBar && !forceVertScrollBar)
{
bool needHorizScrollBarWithoutVertScrollBar = false;
if (allowHorizScrollBar &&
DoubleUtil.GreaterThan(totalVisibleWidth, cellsWidth) &&
DoubleUtil.LessThan(totalVisibleFrozenWidth, cellsWidth) &&
DoubleUtil.LessThanOrClose(horizScrollBarHeight, cellsHeight))
{
double oldDataHeight = cellsHeight;
cellsHeight -= horizScrollBarHeight;
DiagnosticsDebug.Assert(cellsHeight >= 0, "Expected nonnegative cellsHeight.");
needHorizScrollBarWithoutVertScrollBar = needHorizScrollBar = true;
if (vertScrollBarWidth > 0 &&
allowVertScrollBar &&
(DoubleUtil.LessThanOrClose(totalVisibleWidth - cellsWidth, vertScrollBarWidth) || DoubleUtil.LessThanOrClose(cellsWidth - totalVisibleFrozenWidth, vertScrollBarWidth)))
{
// Would we still need a horizontal scrollbar without the vertical one?
UpdateDisplayedRows(this.DisplayData.FirstScrollingSlot, cellsHeight);
if (this.DisplayData.NumTotallyDisplayedScrollingElements != this.VisibleSlotCount)
{
needHorizScrollBar = DoubleUtil.LessThan(totalVisibleFrozenWidth, cellsWidth - vertScrollBarWidth);
}
if (!needHorizScrollBar)
{
// Restore the old data height because it turns out a horizontal scrollbar wouldn't make sense
cellsHeight = oldDataHeight;
}
}
}
// Store the current FirstScrollingSlot because removing the horizontal scrollbar could scroll
// the DataGrid up; however, if we realize later that we need to keep the horizontal scrollbar
// then we should use the first slot stored here which is not scrolled.
int firstScrollingSlot = this.DisplayData.FirstScrollingSlot;
UpdateDisplayedRows(firstScrollingSlot, cellsHeight);
if (allowVertScrollBar &&
DoubleUtil.GreaterThan(cellsHeight, 0) &&
DoubleUtil.LessThanOrClose(vertScrollBarWidth, cellsWidth) &&
this.DisplayData.NumTotallyDisplayedScrollingElements != this.VisibleSlotCount)
{
cellsWidth -= vertScrollBarWidth;
DiagnosticsDebug.Assert(cellsWidth >= 0, "Expected nonnegative cellsWidth.");
needVertScrollBar = true;
}
this.DisplayData.FirstDisplayedScrollingCol = ComputeFirstVisibleScrollingColumn();
// We compute the number of visible columns only after we set up the vertical scroll bar.
ComputeDisplayedColumns();
if ((vertScrollBarWidth > 0 || horizScrollBarHeight > 0) &&
allowHorizScrollBar &&
needVertScrollBar && !needHorizScrollBar &&
DoubleUtil.GreaterThan(totalVisibleWidth, cellsWidth) &&
DoubleUtil.LessThan(totalVisibleFrozenWidth, cellsWidth) &&
DoubleUtil.LessThanOrClose(horizScrollBarHeight, cellsHeight))
{
cellsWidth += vertScrollBarWidth;
cellsHeight -= horizScrollBarHeight;
DiagnosticsDebug.Assert(cellsHeight >= 0, "Expected nonnegative cellsHeight.");
needVertScrollBar = false;
UpdateDisplayedRows(firstScrollingSlot, cellsHeight);
if (cellsHeight > 0 &&
vertScrollBarWidth <= cellsWidth &&
this.DisplayData.NumTotallyDisplayedScrollingElements != this.VisibleSlotCount)
{
cellsWidth -= vertScrollBarWidth;
DiagnosticsDebug.Assert(cellsWidth >= 0, "Expected nonnegative cellsWidth.");
needVertScrollBar = true;
}
if (needVertScrollBar)
{
needHorizScrollBar = true;
}
else
{
needHorizScrollBar = needHorizScrollBarWithoutVertScrollBar;
}
}
}
else if (forceHorizScrollBar && !forceVertScrollBar)
{
if (allowVertScrollBar)
{
if (cellsHeight > 0 &&
DoubleUtil.LessThanOrClose(vertScrollBarWidth, cellsWidth) &&
this.DisplayData.NumTotallyDisplayedScrollingElements != this.VisibleSlotCount)
{
cellsWidth -= vertScrollBarWidth;
DiagnosticsDebug.Assert(cellsWidth >= 0, "Expected nonnegative cellsWidth.");
needVertScrollBar = true;
}
this.DisplayData.FirstDisplayedScrollingCol = ComputeFirstVisibleScrollingColumn();
ComputeDisplayedColumns();
}
needHorizScrollBar = totalVisibleWidth > cellsWidth && totalVisibleFrozenWidth < cellsWidth;
}
else if (!forceHorizScrollBar && forceVertScrollBar)
{
if (allowHorizScrollBar)
{
if (cellsWidth > 0 &&
DoubleUtil.LessThanOrClose(horizScrollBarHeight, cellsHeight) &&
DoubleUtil.GreaterThan(totalVisibleWidth, cellsWidth) &&
DoubleUtil.LessThan(totalVisibleFrozenWidth, cellsWidth))
{
cellsHeight -= horizScrollBarHeight;
DiagnosticsDebug.Assert(cellsHeight >= 0, "Expected nonnegative cellsHeight.");
needHorizScrollBar = true;
UpdateDisplayedRows(this.DisplayData.FirstScrollingSlot, cellsHeight);
}
this.DisplayData.FirstDisplayedScrollingCol = ComputeFirstVisibleScrollingColumn();
ComputeDisplayedColumns();
}
needVertScrollBar = this.DisplayData.NumTotallyDisplayedScrollingElements != this.VisibleSlotCount;
}
else
{
DiagnosticsDebug.Assert(forceHorizScrollBar, "Expected forceHorizScrollBar is true.");
DiagnosticsDebug.Assert(forceVertScrollBar, "Expected forceVertScrollBar is true.");
DiagnosticsDebug.Assert(allowHorizScrollBar, "Expected allowHorizScrollBar is true.");
DiagnosticsDebug.Assert(allowVertScrollBar, "Expected allowVertScrollBar is true.");
this.DisplayData.FirstDisplayedScrollingCol = ComputeFirstVisibleScrollingColumn();
ComputeDisplayedColumns();
needVertScrollBar = this.DisplayData.NumTotallyDisplayedScrollingElements != this.VisibleSlotCount;
needHorizScrollBar = totalVisibleWidth > cellsWidth && totalVisibleFrozenWidth < cellsWidth;
}
UpdateHorizontalScrollBar(needHorizScrollBar, forceHorizScrollBar, totalVisibleWidth, totalVisibleFrozenWidth, cellsWidth);
UpdateVerticalScrollBar(needVertScrollBar, forceVertScrollBar, totalVisibleHeight, cellsHeight);
if (_topRightCornerHeader != null)
{
// Show the TopRightHeaderCell based on vertical ScrollBar visibility
if (this.AreColumnHeadersVisible &&
_vScrollBar != null && _vScrollBar.Visibility == Visibility.Visible)
{
_topRightCornerHeader.Visibility = Visibility.Visible;
}
else
{
_topRightCornerHeader.Visibility = Visibility.Collapsed;
}
}
if (_bottomRightCorner != null)
{
// Show the BottomRightCorner when both scrollbars are visible.
_bottomRightCorner.Visibility =
_hScrollBar != null && _hScrollBar.Visibility == Visibility.Visible &&
_vScrollBar != null && _vScrollBar.Visibility == Visibility.Visible ?
Visibility.Visible : Visibility.Collapsed;
}
this.DisplayData.FullyRecycleElements();
}
#if FEATURE_VALIDATION_SUMMARY
/// <summary>
/// Create a ValidationSummaryItem for a given ValidationResult by finding all cells related to the
/// validation error and adding them as separate ValidationSummaryItemSources.
/// </summary>
/// <param name="validationResult">ValidationResult</param>
/// <returns>ValidationSummaryItem</returns>
private ValidationSummaryItem CreateValidationSummaryItem(ValidationResult validationResult)
{
DiagnosticsDebug.Assert(validationResult != null);
DiagnosticsDebug.Assert(_validationSummary != null);
DiagnosticsDebug.Assert(this.EditingRow != null, "Expected non-null EditingRow.");
ValidationSummaryItem validationSummaryItem = new ValidationSummaryItem(validationResult.ErrorMessage);
validationSummaryItem.Context = validationResult;
string messageHeader = null;
foreach (DataGridColumn column in this.ColumnsInternal.GetDisplayedColumns(c => c.IsVisible && !c.IsReadOnly))
{
foreach (string property in validationResult.MemberNames)
{
if (!string.IsNullOrEmpty(property) && column.BindingPaths.Contains(property))
{
validationSummaryItem.Sources.Add(new ValidationSummaryItemSource(property, this.EditingRow.Cells[column.Index]));
if (string.IsNullOrEmpty(messageHeader) && column.Header != null)
{
messageHeader = column.Header.ToString();
}
}
}
}
DiagnosticsDebug.Assert(validationSummaryItem.ItemType == ValidationSummaryItemType.ObjectError);
if (_propertyValidationResults.ContainsEqualValidationResult(validationResult))
{
validationSummaryItem.MessageHeader = messageHeader;
validationSummaryItem.ItemType = ValidationSummaryItemType.PropertyError;
}
return validationSummaryItem;
}
#endif
/// <summary>
/// Handles the current editing element's LostFocus event by performing any actions that
/// were cached by the WaitForLostFocus method.
/// </summary>
/// <param name="sender">Editing element</param>
/// <param name="e">RoutedEventArgs</param>
private void EditingElement_LostFocus(object sender, RoutedEventArgs e)
{
FrameworkElement editingElement = sender as FrameworkElement;
if (editingElement != null)
{
editingElement.LostFocus -= new RoutedEventHandler(EditingElement_LostFocus);
if (this.EditingRow != null && this.EditingColumnIndex != -1)
{
this.FocusEditingCell(true);
}
DiagnosticsDebug.Assert(_lostFocusActions != null, "Expected non-null _lostFocusActions.");
try
{
_executingLostFocusActions = true;
while (_lostFocusActions.Count > 0)
{
_lostFocusActions.Dequeue()();
}
}
finally
{
_executingLostFocusActions = false;
}
}
}
// Makes sure horizontal layout is updated to reflect any changes that affect it
private void EnsureHorizontalLayout()
{
this.ColumnsInternal.EnsureVisibleEdgedColumnsWidth();
InvalidateColumnHeadersMeasure();
InvalidateRowsMeasure(true);
InvalidateMeasure();
}
/// <summary>
/// Ensures that the RowHeader widths are properly sized and invalidates them if they are not
/// </summary>
/// <returns>True if a RowHeader or RowGroupHeader was invalidated</returns>
private bool EnsureRowHeaderWidth()
{
bool invalidated = false;
if (this.AreRowHeadersVisible)
{
if (this.AreColumnHeadersVisible)
{
EnsureTopLeftCornerHeader();
}
if (_rowsPresenter != null)
{
foreach (UIElement element in _rowsPresenter.Children)
{
DataGridRow row = element as DataGridRow;
if (row != null)
{
// If the RowHeader resulted in a different width the last time it was measured, we need
// to re-measure it
if (row.HeaderCell != null && row.HeaderCell.DesiredSize.Width != this.ActualRowHeaderWidth)
{
row.HeaderCell.InvalidateMeasure();
invalidated = true;
}
}
else
{
DataGridRowGroupHeader groupHeader = element as DataGridRowGroupHeader;
if (groupHeader != null && groupHeader.HeaderCell != null && groupHeader.HeaderCell.DesiredSize.Width != this.ActualRowHeaderWidth)
{
groupHeader.HeaderCell.InvalidateMeasure();
invalidated = true;
}
}
}
if (invalidated)
{
// We need to update the width of the horizontal scrollbar if the rowHeaders' width actually changed
if (this.ColumnsInternal.VisibleStarColumnCount > 0)
{
this.ColumnsInternal.EnsureVisibleEdgedColumnsWidth();
}
InvalidateMeasure();
}
}
}
return invalidated;
}
private void EnsureRowsPresenterVisibility()
{
if (_rowsPresenter != null)
{
// RowCount doesn't need to be considered; doing so might cause extra Visibility changes.
_rowsPresenter.Visibility = this.ColumnsInternal.FirstVisibleNonFillerColumn == null ? Visibility.Collapsed : Visibility.Visible;
}
}
private void EnsureTopLeftCornerHeader()
{
if (_topLeftCornerHeader != null)
{
_topLeftCornerHeader.Visibility = this.HeadersVisibility == DataGridHeadersVisibility.All ? Visibility.Visible : Visibility.Collapsed;
if (_topLeftCornerHeader.Visibility == Visibility.Visible)
{
if (!double.IsNaN(this.RowHeaderWidth))
{
// RowHeaderWidth is set explicitly so we should use that
_topLeftCornerHeader.Width = this.RowHeaderWidth;
}
else if (this.VisibleSlotCount > 0)
{
// RowHeaders are auto-sized and we have at least one row, so use the desired width
_topLeftCornerHeader.Width = this.RowHeadersDesiredWidth;
}
}
}
}
#if FEATURE_VALIDATION_SUMMARY
/// <summary>
/// Handles the ValidationSummary's FocusingInvalidControl event and begins edit on the cells
/// that are associated with the selected error.
/// </summary>
/// <param name="sender">ValidationSummary</param>
/// <param name="e">FocusingInvalidControlEventArgs</param>
private void ValidationSummary_FocusingInvalidControl(object sender, FocusingInvalidControlEventArgs e)
{
DiagnosticsDebug.Assert(_validationSummary != null);
if (this.EditingRow == null || this.IsSlotOutOfBounds(this.EditingRow.Slot) || this.EditingRow.Slot == -1 || !ScrollSlotIntoView(this.EditingRow.Slot, false /*scrolledHorizontally*/))
{
return;
}
// We need to focus the DataGrid in case the focused element gets removed when we end edit.
if ((_editingColumnIndex == -1 || (this.Focus(FocusState.Programmatic) && EndCellEdit(DataGridEditAction.Commit, true, true, true)))
&& e.Item != null && e.Target != null && _validationSummary.Errors.Contains(e.Item))
{
DataGridCell cell = e.Target.Control as DataGridCell;
if (cell != null && cell.OwningGrid == this && cell.OwningColumn != null && cell.OwningColumn.IsVisible)
{
DiagnosticsDebug.Assert(cell.ColumnIndex >= 0 && cell.ColumnIndex < this.ColumnsInternal.Count);
// Begin editing the next relevant cell
UpdateSelectionAndCurrency(cell.ColumnIndex, this.EditingRow.Slot, DataGridSelectionAction.None, true /*scrollIntoView*/);
if (_successfullyUpdatedSelection)
{
BeginCellEdit(new RoutedEventArgs());
if (!IsColumnDisplayed(this.CurrentColumnIndex))
{
ScrollColumnIntoView(this.CurrentColumnIndex);
}
}
}
e.Handled = true;
}
}
/// <summary>
/// Handles the ValidationSummary's SelectionChanged event and changes which cells are displayed as invalid.
/// </summary>
/// <param name="sender">ValidationSummary</param>
/// <param name="e">SelectionChangedEventArgs</param>
private void ValidationSummary_SelectionChanged(object sender, SelectionChangedEventArgs e)
{
// ValidationSummary only supports single-selection mode.
if (e.AddedItems.Count == 1)
{
_selectedValidationSummaryItem = e.AddedItems[0] as ValidationSummaryItem;
}
this.UpdateValidationStatus();
}
#endif
// Recursively expands parent RowGroupHeaders from the top down
private void ExpandRowGroupParentChain(int level, int slot)
{
if (level < 0)
{
return;
}
int previousHeaderSlot = this.RowGroupHeadersTable.GetPreviousIndex(slot + 1);
while (previousHeaderSlot >= 0)
{
DataGridRowGroupInfo rowGroupInfo = this.RowGroupHeadersTable.GetValueAt(previousHeaderSlot);
DiagnosticsDebug.Assert(rowGroupInfo != null, "Expected non-null rowGroupInfo.");
if (level == rowGroupInfo.Level)
{
if (_collapsedSlotsTable.Contains(rowGroupInfo.Slot))
{
// Keep going up the chain
ExpandRowGroupParentChain(level - 1, rowGroupInfo.Slot - 1);
}
if (rowGroupInfo.Visibility != Visibility.Visible)
{
EnsureRowGroupVisibility(rowGroupInfo, Visibility.Visible, false);
}
return;
}
else
{
previousHeaderSlot = this.RowGroupHeadersTable.GetPreviousIndex(previousHeaderSlot);
}
}
}
#if FEATURE_VALIDATION_SUMMARY
/// <summary>
/// Searches through the DataGrid's ValidationSummary for any errors that use the given
/// ValidationResult as the ValidationSummaryItem's Context value.
/// </summary>
/// <param name="context">ValidationResult</param>
/// <returns>ValidationSummaryItem or null if not found</returns>
private ValidationSummaryItem FindValidationSummaryItem(ValidationResult context)
{
DiagnosticsDebug.Assert(context != null);
DiagnosticsDebug.Assert(_validationSummary != null);
foreach (ValidationSummaryItem validationSummaryItem in _validationSummary.Errors)
{
if (context.Equals(validationSummaryItem.Context))
{
return validationSummaryItem;
}
}
return null;
}
#endif
private void InvalidateCellsArrange()
{
foreach (DataGridRow row in GetAllRows())
{
row.InvalidateHorizontalArrange();
}
}
private void InvalidateColumnHeadersArrange()
{
if (_columnHeadersPresenter != null)
{
_columnHeadersPresenter.InvalidateArrange();
}
}
private void InvalidateColumnHeadersMeasure()
{
if (_columnHeadersPresenter != null)
{
EnsureColumnHeadersVisibility();
_columnHeadersPresenter.InvalidateMeasure();
}
}
private void InvalidateRowsArrange()
{
if (_rowsPresenter != null)
{
_rowsPresenter.InvalidateArrange();
}
}
private void InvalidateRowsMeasure(bool invalidateIndividualElements)
{
if (_rowsPresenter != null)
{
_rowsPresenter.InvalidateMeasure();
if (invalidateIndividualElements)
{
foreach (UIElement element in _rowsPresenter.Children)
{
element.InvalidateMeasure();
}
}
}
}
private void DataGrid_GettingFocus(UIElement sender, GettingFocusEventArgs e)
{
_focusInputDevice = e.InputDevice;
}
private void DataGrid_GotFocus(object sender, RoutedEventArgs e)
{
if (!this.ContainsFocus)
{
this.ContainsFocus = true;
ApplyDisplayedRowsState(this.DisplayData.FirstScrollingSlot, this.DisplayData.LastScrollingSlot);
if (this.CurrentColumnIndex != -1 && this.IsSlotVisible(this.CurrentSlot))
{
UpdateCurrentState(this.DisplayData.GetDisplayedElement(this.CurrentSlot), this.CurrentColumnIndex, true /*applyCellState*/);
}
}
DependencyObject focusedElement = e.OriginalSource as DependencyObject;
_focusedObject = focusedElement;
while (focusedElement != null)
{
// Keep track of which row contains the newly focused element
var focusedRow = focusedElement as DataGridRow;
if (focusedRow != null && focusedRow.OwningGrid == this && _focusedRow != focusedRow)
{
ResetFocusedRow();
_focusedRow = focusedRow.Visibility == Visibility.Visible ? focusedRow : null;
break;
}
focusedElement = VisualTreeHelper.GetParent(focusedElement);
}
_preferMouseIndicators = _focusInputDevice == FocusInputDeviceKind.Mouse || _focusInputDevice == FocusInputDeviceKind.Pen;
ShowScrollBars();
// If the DataGrid itself got focus, we actually want the automation focus to be on the current element
if (e.OriginalSource == this && AutomationPeer.ListenerExists(AutomationEvents.AutomationFocusChanged))
{
DataGridAutomationPeer peer = DataGridAutomationPeer.FromElement(this) as DataGridAutomationPeer;
if (peer != null)
{
peer.RaiseAutomationFocusChangedEvent(this.CurrentSlot, this.CurrentColumnIndex);
}
}
}
private void DataGrid_IsEnabledChanged(object sender, DependencyPropertyChangedEventArgs e)
{
UpdateDisabledVisual();
if (!this.IsEnabled)
{
HideScrollBars(true /*useTransitions*/);
}
}
private void DataGrid_KeyDown(object sender, KeyRoutedEventArgs e)
{
if (!e.Handled)
{
e.Handled = ProcessDataGridKey(e);
this.LastHandledKeyDown = e.Handled ? e.Key : VirtualKey.None;
}
}
private void DataGrid_KeyUp(object sender, KeyRoutedEventArgs e)
{
if (e.Key == VirtualKey.Tab && e.OriginalSource == this)
{
if (this.CurrentColumnIndex == -1)
{
if (this.ColumnHeaders != null && this.AreColumnHeadersVisible && !this.ColumnHeaderHasFocus)
{
this.ColumnHeaderHasFocus = true;
}
}
else
{
if (this.ColumnHeaders != null && this.AreColumnHeadersVisible)
{
KeyboardHelper.GetMetaKeyState(out _, out var shift);
if (shift && this.LastHandledKeyDown != VirtualKey.Tab)
{
DiagnosticsDebug.Assert(!this.ColumnHeaderHasFocus, "Expected ColumnHeaderHasFocus is false.");
// Show currency on the current column's header as focus is entering the DataGrid backwards.
this.ColumnHeaderHasFocus = true;
}
}
bool success = ScrollSlotIntoView(this.CurrentColumnIndex, this.CurrentSlot, false /*forCurrentCellChange*/, true /*forceHorizontalScroll*/);
DiagnosticsDebug.Assert(success, "Expected ScrollSlotIntoView returns true.");
if (this.CurrentColumnIndex != -1 && this.SelectedItem == null)
{
SetRowSelection(this.CurrentSlot, true /*isSelected*/, true /*setAnchorSlot*/);
}
}
}
}
private void DataGrid_LostFocus(object sender, RoutedEventArgs e)
{
_focusedObject = null;
if (this.ContainsFocus)
{
bool focusLeftDataGrid = true;
bool dataGridWillReceiveRoutedEvent = true;
DataGridColumn editingColumn = null;
// Walk up the visual tree of the newly focused element
// to determine if focus is still within DataGrid.
object focusedObject = GetFocusedElement();
DependencyObject focusedDependencyObject = focusedObject as DependencyObject;
while (focusedDependencyObject != null)
{
if (focusedDependencyObject == this)
{
focusLeftDataGrid = false;
break;
}
// Walk up the visual tree. Try using the framework element's
// parent. We do this because Popups behave differently with respect to the visual tree,
// and it could have a parent even if the VisualTreeHelper doesn't find it.
DependencyObject parent = null;
FrameworkElement element = focusedDependencyObject as FrameworkElement;
if (element == null)
{
parent = VisualTreeHelper.GetParent(focusedDependencyObject);
}
else
{
parent = element.Parent;
if (parent == null)
{
parent = VisualTreeHelper.GetParent(focusedDependencyObject);
}
else
{
dataGridWillReceiveRoutedEvent = false;
}
}
focusedDependencyObject = parent;
}
if (this.EditingRow != null && this.EditingColumnIndex != -1)
{
editingColumn = this.ColumnsItemsInternal[this.EditingColumnIndex];
if (focusLeftDataGrid && editingColumn is DataGridTemplateColumn)
{
dataGridWillReceiveRoutedEvent = false;
}
}
if (focusLeftDataGrid && !(editingColumn is DataGridTemplateColumn))
{
this.ContainsFocus = false;
if (this.EditingRow != null)
{
CommitEdit(DataGridEditingUnit.Row, true /*exitEditingMode*/);
}
ResetFocusedRow();
ApplyDisplayedRowsState(this.DisplayData.FirstScrollingSlot, this.DisplayData.LastScrollingSlot);
if (this.ColumnHeaderHasFocus)
{
this.ColumnHeaderHasFocus = false;
}
else if (this.CurrentColumnIndex != -1 && this.IsSlotVisible(this.CurrentSlot))
{
UpdateCurrentState(this.DisplayData.GetDisplayedElement(this.CurrentSlot), this.CurrentColumnIndex, true /*applyCellState*/);
}
}
else if (!dataGridWillReceiveRoutedEvent)
{
FrameworkElement focusedElement = focusedObject as FrameworkElement;
if (focusedElement != null)
{
focusedElement.LostFocus += new RoutedEventHandler(ExternalEditingElement_LostFocus);
}
}
}
}
private object GetFocusedElement()
{
if (TypeHelper.IsXamlRootAvailable && XamlRoot != null)
{
return FocusManager.GetFocusedElement(XamlRoot);
}
else
{
return FocusManager.GetFocusedElement();
}
}
private void DataGrid_PointerEntered(object sender, PointerRoutedEventArgs e)
{
if (e.Pointer.PointerDeviceType != PointerDeviceType.Touch)
{
// Mouse/Pen inputs dominate. If touch panning indicators are shown, switch to mouse indicators.
_preferMouseIndicators = true;
ShowScrollBars();
}
}
private void DataGrid_PointerExited(object sender, PointerRoutedEventArgs e)
{
if (e.Pointer.PointerDeviceType != PointerDeviceType.Touch)
{
// Mouse/Pen inputs dominate. If touch panning indicators are shown, switch to mouse indicators.
_isPointerOverHorizontalScrollBar = false;
_isPointerOverVerticalScrollBar = false;
_preferMouseIndicators = true;
ShowScrollBars();
HideScrollBarsAfterDelay();
}
}
private void DataGrid_PointerMoved(object sender, PointerRoutedEventArgs e)
{
// Don't process if this is a generated replay of the event.
if (e.IsGenerated)
{
return;
}
if (e.Pointer.PointerDeviceType != PointerDeviceType.Touch)
{
// Mouse/Pen inputs dominate. If touch panning indicators are shown, switch to mouse indicators.
_preferMouseIndicators = true;
ShowScrollBars();
if (!UISettingsHelper.AreSettingsEnablingAnimations &&
_hideScrollBarsTimer != null &&
(_isPointerOverHorizontalScrollBar || _isPointerOverVerticalScrollBar))
{
StopHideScrollBarsTimer();
}
}
}
private void DataGrid_PointerPressed(object sender, PointerRoutedEventArgs e)
{
if (e.Handled)
{
return;
}
// Show the scroll bars as soon as a pointer is pressed on the DataGrid.
ShowScrollBars();
}
private void DataGrid_PointerReleased(object sender, PointerRoutedEventArgs e)
{
if (this.CurrentColumnIndex != -1 && this.CurrentSlot != -1)
{
e.Handled = true;
}
}
private void DataGrid_Unloaded(object sender, RoutedEventArgs e)
{
_showingMouseIndicators = false;
_keepScrollBarsShowing = false;
}
#if FEATURE_VALIDATION
private void EditingElement_BindingValidationError(object sender, ValidationErrorEventArgs e)
{
if (e.Action == ValidationErrorEventAction.Added && e.Error.Exception != null && e.Error.ErrorContent != null)
{
ValidationResult validationResult = new ValidationResult(e.Error.ErrorContent.ToString(), new List<string>() { _updateSourcePath });
_bindingValidationResults.AddIfNew(validationResult);
}
}
#endif
private void EditingElement_Loaded(object sender, RoutedEventArgs e)
{
FrameworkElement element = sender as FrameworkElement;
if (element != null)
{
element.Loaded -= new RoutedEventHandler(EditingElement_Loaded);
}
PreparingCellForEditPrivate(element);
}
// Ends the current cell edit, committing or canceling it. Returns false if the operation was interrupted or failed.
private bool EndCellEdit(DataGridEditAction editAction, bool exitEditingMode, bool keepFocus, bool raiseEvents)
{
if (_editingColumnIndex == -1)
{
return true;
}
DiagnosticsDebug.Assert(this.EditingRow != null, "Expected non-null EditingRow.");
DiagnosticsDebug.Assert(this.EditingRow.Slot == this.CurrentSlot, "Expected EditingRow.Slot equals CurrentSlot.");
DiagnosticsDebug.Assert(_editingColumnIndex >= 0, "Expected nonnegative _editingColumnIndex.");
DiagnosticsDebug.Assert(_editingColumnIndex < this.ColumnsItemsInternal.Count, "Expected _editingColumnIndex smaller than this.ColumnsItemsInternal.Count.");
DiagnosticsDebug.Assert(_editingColumnIndex == this.CurrentColumnIndex, "Expected _editingColumnIndex equals this.CurrentColumnIndex.");
// Cache these to see if they change later
int currentSlot = this.CurrentSlot;
int currentColumnIndex = this.CurrentColumnIndex;
// We're ready to start ending, so raise the event
DataGridCell editingCell = this.EditingRow.Cells[_editingColumnIndex];
FrameworkElement editingElement = editingCell.Content as FrameworkElement;
if (editingElement == null)
{
return false;
}
if (raiseEvents)
{
DataGridCellEditEndingEventArgs e = new DataGridCellEditEndingEventArgs(this.CurrentColumn, this.EditingRow, editingElement, editAction);
OnCellEditEnding(e);
if (e.Cancel)
{
// CellEditEnding has been canceled
return false;
}
// Ensure that the current cell wasn't changed in the user's CellEditEnding handler
if (_editingColumnIndex == -1 ||
currentSlot != this.CurrentSlot ||
currentColumnIndex != this.CurrentColumnIndex)
{
return true;
}
DiagnosticsDebug.Assert(this.EditingRow != null, "Expected non-null EditingRow.");
DiagnosticsDebug.Assert(this.EditingRow.Slot == currentSlot, "Expected EditingRow.Slot equals currentSlot.");
DiagnosticsDebug.Assert(_editingColumnIndex != -1, "Expected _editingColumnIndex other than -1.");
DiagnosticsDebug.Assert(_editingColumnIndex == this.CurrentColumnIndex, "Expected _editingColumnIndex equals CurrentColumnIndex.");
}
_bindingValidationResults.Clear();
// If we're canceling, let the editing column repopulate its old value if it wants
if (editAction == DataGridEditAction.Cancel)
{
this.CurrentColumn.CancelCellEditInternal(editingElement, _uneditedValue);
// Ensure that the current cell wasn't changed in the user column's CancelCellEdit
if (_editingColumnIndex == -1 ||
currentSlot != this.CurrentSlot ||
currentColumnIndex != this.CurrentColumnIndex)
{
return true;
}
DiagnosticsDebug.Assert(this.EditingRow != null, "Expected non-null EditingRow.");
DiagnosticsDebug.Assert(this.EditingRow.Slot == currentSlot, "Expected EditingRow.Slot equals currentSlot.");
DiagnosticsDebug.Assert(_editingColumnIndex != -1, "Expected _editingColumnIndex other than -1.");
DiagnosticsDebug.Assert(_editingColumnIndex == this.CurrentColumnIndex, "Expected _editingColumnIndex equals CurrentColumnIndex.");
// Re-validate
this.ValidateEditingRow(true /*scrollIntoView*/, false /*wireEvents*/);
}
// If we're committing, explicitly update the source but watch out for any validation errors
if (editAction == DataGridEditAction.Commit)
{
foreach (BindingInfo bindingData in this.CurrentColumn.GetInputBindings(editingElement, this.CurrentItem))
{
DiagnosticsDebug.Assert(bindingData.BindingExpression.ParentBinding != null, "Expected non-null bindingData.BindingExpression.ParentBinding.");
_updateSourcePath = bindingData.BindingExpression.ParentBinding.Path != null ? bindingData.BindingExpression.ParentBinding.Path.Path : null;
#if FEATURE_VALIDATION
bindingData.Element.BindingValidationError += new EventHandler<ValidationErrorEventArgs>(EditingElement_BindingValidationError);
#endif
try
{
bindingData.BindingExpression.UpdateSource();
}
finally
{
#if FEATURE_VALIDATION
bindingData.Element.BindingValidationError -= new EventHandler<ValidationErrorEventArgs>(EditingElement_BindingValidationError);
#endif
}
}
// Re-validate
this.ValidateEditingRow(true /*scrollIntoView*/, false /*wireEvents*/);
if (_bindingValidationResults.Count > 0)
{
ScrollSlotIntoView(this.CurrentColumnIndex, this.CurrentSlot, false /*forCurrentCellChange*/, true /*forceHorizontalScroll*/);
return false;
}
}
if (exitEditingMode)
{
_editingColumnIndex = -1;
editingCell.ApplyCellState(true /*animate*/);
// TODO: Figure out if we should restore a cached this.IsTabStop.
this.IsTabStop = true;
if (keepFocus && editingElement.ContainsFocusedElement(this))
{
this.Focus(FocusState.Programmatic);
}
PopulateCellContent(!exitEditingMode /*isCellEdited*/, this.CurrentColumn, this.EditingRow, editingCell);
}
// We're done, so raise the CellEditEnded event
if (raiseEvents)
{
OnCellEditEnded(new DataGridCellEditEndedEventArgs(this.CurrentColumn, this.EditingRow, editAction));
}
// There's a chance that somebody reopened this cell for edit within the CellEditEnded handler,
// so we should return false if we were supposed to exit editing mode, but we didn't
return !(exitEditingMode && currentColumnIndex == _editingColumnIndex);
}
// Ends the current row edit, committing or canceling it. Returns false if the operation was interrupted or failed.
private bool EndRowEdit(DataGridEditAction editAction, bool exitEditingMode, bool raiseEvents)
{
// Explicit row end edit has been triggered, so we can no longer be initializing a new item.
_initializingNewItem = false;
if (this.EditingRow == null || this.DataConnection.EndingEdit)
{
return true;
}
if (_editingColumnIndex != -1 || (editAction == DataGridEditAction.Cancel && raiseEvents &&
!(this.DataConnection.CanCancelEdit || this.EditingRow.DataContext is IEditableObject || this.DataConnection.IsAddingNew)))
{
// Ending the row edit will fail immediately under the following conditions:
// 1. We haven't ended the cell edit yet.
// 2. We're trying to cancel edit when the underlying DataType is not an IEditableObject,
// because we have no way to properly restore the old value. We will only allow this to occur if:
// a. raiseEvents == false, which means we're internally forcing a cancel or
// b. we're canceling a new item addition.
return false;
}
DataGridRow editingRow = this.EditingRow;
if (raiseEvents)
{
DataGridRowEditEndingEventArgs e = new DataGridRowEditEndingEventArgs(this.EditingRow, editAction);
OnRowEditEnding(e);
if (e.Cancel)
{
// RowEditEnding has been canceled
return false;
}
// Editing states might have been changed in the RowEditEnding handlers
if (_editingColumnIndex != -1)
{
return false;
}
if (editingRow != this.EditingRow)
{
return true;
}
}
// Call the appropriate commit or cancel methods
if (editAction == DataGridEditAction.Commit)
{
if (!CommitRowEdit(exitEditingMode))
{
return false;
}
}
else
{
if (!CancelRowEdit(exitEditingMode) && raiseEvents)
{
// We failed to cancel the edit, so we should abort unless we're forcing a cancel
return false;
}
}
ResetValidationStatus();
// Update the previously edited row's state
if (exitEditingMode && editingRow == this.EditingRow)
{
// Unwire the INDEI event handlers
foreach (INotifyDataErrorInfo indei in _validationItems.Keys)
{
indei.ErrorsChanged -= new EventHandler<DataErrorsChangedEventArgs>(ValidationItem_ErrorsChanged);
}
_validationItems.Clear();
this.RemoveEditingElements();
ResetEditingRow();
}
if (this.CurrentSlot == -1 && this.DataConnection.CollectionView != null && this.DataConnection.CollectionView.CurrentItem != null)
{
// Some EditableCollectionViews (ListCollectionView in particular) do not raise CurrentChanged when CommitEdit
// changes the position of the CurrentItem. Instead, they raise a PropertyChanged event for PositionChanged.
// We recognize that case here and set up the CurrentItem again if one exists but it was removed and re-added
// during Commit. This is better than reacting to PositionChanged which would double the work in most cases
// and likely introduce regressions.
UpdateStateOnCurrentChanged(this.DataConnection.CollectionView.CurrentItem, this.DataConnection.CollectionView.CurrentPosition);
}
// Raise the RowEditEnded event
if (raiseEvents)
{
OnRowEditEnded(new DataGridRowEditEndedEventArgs(editingRow, editAction));
}
return true;
}
private void EnsureColumnHeadersVisibility()
{
if (_columnHeadersPresenter != null)
{
_columnHeadersPresenter.Visibility = this.AreColumnHeadersVisible ? Visibility.Visible : Visibility.Collapsed;
}
}
private void EnsureVerticalGridLines()
{
if (this.AreColumnHeadersVisible)
{
double totalColumnsWidth = 0;
foreach (DataGridColumn column in this.ColumnsInternal)
{
totalColumnsWidth += column.ActualWidth;
column.HeaderCell.SeparatorVisibility = (column != this.ColumnsInternal.LastVisibleColumn || totalColumnsWidth < this.CellsWidth) ?
Visibility.Visible : Visibility.Collapsed;
}
}
foreach (DataGridRow row in GetAllRows())
{
row.EnsureGridLines();
}
}
/// <summary>
/// Exits editing mode without trying to commit or revert the editing, and
/// without repopulating the edited row's cell.
/// </summary>
private void ExitEdit(bool keepFocus)
{
// We're exiting editing mode, so we can no longer be initializing a new item.
_initializingNewItem = false;
if (this.EditingRow == null || this.DataConnection.EndingEdit)
{
DiagnosticsDebug.Assert(_editingColumnIndex == -1, "Expected _editingColumnIndex equal to -1.");
return;
}
if (_editingColumnIndex != -1)
{
DiagnosticsDebug.Assert(this.EditingRow != null, "Expected non-null EditingRow.");
DiagnosticsDebug.Assert(this.EditingRow.Slot == this.CurrentSlot, "Expected EditingRow.Slot equals CurrentSlot.");
DiagnosticsDebug.Assert(_editingColumnIndex >= 0, "Expected non-negative _editingColumnIndex.");
DiagnosticsDebug.Assert(_editingColumnIndex < this.ColumnsItemsInternal.Count, "Expected _editingColumnIndex smaller than this.ColumnsItemsInternal.Count.");
DiagnosticsDebug.Assert(_editingColumnIndex == this.CurrentColumnIndex, "Expected _editingColumnIndex equals CurrentColumnIndex.");
_editingColumnIndex = -1;
this.EditingRow.Cells[this.CurrentColumnIndex].ApplyCellState(false /*animate*/);
}
// TODO: Figure out if we should restore a cached this.IsTabStop.
this.IsTabStop = true;
if (this.IsSlotVisible(this.EditingRow.Slot))
{
this.EditingRow.ApplyState(true /*animate*/);
}
ResetEditingRow();
if (keepFocus)
{
bool success = Focus(FocusState.Programmatic);
DiagnosticsDebug.Assert(success, "Expected successful Focus call.");
}
}
private void ExternalEditingElement_LostFocus(object sender, RoutedEventArgs e)
{
FrameworkElement element = sender as FrameworkElement;
if (element != null)
{
element.LostFocus -= new RoutedEventHandler(ExternalEditingElement_LostFocus);
DataGrid_LostFocus(sender, e);
}
}
private void FlushCurrentCellChanged()
{
if (_makeFirstDisplayedCellCurrentCellPending)
{
return;
}
if (this.SelectionHasChanged)
{
// selection is changing, don't raise CurrentCellChanged until it's done
_flushCurrentCellChanged = true;
FlushSelectionChanged();
return;
}
// We don't want to expand all intermediate currency positions, so we only expand
// the last current item before we flush the event
if (_collapsedSlotsTable.Contains(this.CurrentSlot) && this.CurrentSlot != this.SlotFromRowIndex(this.DataConnection.NewItemPlaceholderIndex))
{
DataGridRowGroupInfo rowGroupInfo = this.RowGroupHeadersTable.GetValueAt(this.RowGroupHeadersTable.GetPreviousIndex(this.CurrentSlot));
DiagnosticsDebug.Assert(rowGroupInfo != null, "Expected non-null rowGroupInfo.");
if (rowGroupInfo != null)
{
this.ExpandRowGroupParentChain(rowGroupInfo.Level, rowGroupInfo.Slot);
}
}
if (this.CurrentColumn != _previousCurrentColumn || this.CurrentItem != _previousCurrentItem)
{
this.CoerceSelectedItem();
_previousCurrentColumn = this.CurrentColumn;
_previousCurrentItem = this.CurrentItem;
OnCurrentCellChanged(EventArgs.Empty);
}
DataGridAutomationPeer peer = DataGridAutomationPeer.FromElement(this) as DataGridAutomationPeer;
if (peer != null && this.CurrentCellCoordinates != _previousAutomationFocusCoordinates)
{
_previousAutomationFocusCoordinates = new DataGridCellCoordinates(this.CurrentCellCoordinates);
// If the DataGrid itself has focus, we want to move automation focus to the new current element
object focusedObject = GetFocusedElement();
if (focusedObject == this && AutomationPeer.ListenerExists(AutomationEvents.AutomationFocusChanged))
{
peer.RaiseAutomationFocusChangedEvent(this.CurrentSlot, this.CurrentColumnIndex);
}
}
_flushCurrentCellChanged = false;
}
private void FlushSelectionChanged()
{
if (this.SelectionHasChanged && _noSelectionChangeCount == 0 && !_makeFirstDisplayedCellCurrentCellPending)
{
this.CoerceSelectedItem();
if (this.NoCurrentCellChangeCount != 0)
{
// current cell is changing, don't raise SelectionChanged until it's done
return;
}
this.SelectionHasChanged = false;
if (_flushCurrentCellChanged)
{
FlushCurrentCellChanged();
}
SelectionChangedEventArgs e = _selectedItems.GetSelectionChangedEventArgs();
if (e.AddedItems.Count > 0 || e.RemovedItems.Count > 0)
{
OnSelectionChanged(e);
}
}
}
private bool FocusEditingCell(bool setFocus)
{
DiagnosticsDebug.Assert(this.CurrentColumnIndex >= 0, "Expected non-negative CurrentColumnIndex.");
DiagnosticsDebug.Assert(this.CurrentColumnIndex < this.ColumnsItemsInternal.Count, "Expected CurrentColumnIndex smaller than ColumnsItemsInternal.Count.");
DiagnosticsDebug.Assert(this.CurrentSlot >= -1, "Expected CurrentSlot greater than or equal to -1.");
DiagnosticsDebug.Assert(this.CurrentSlot < this.SlotCount, "Expected CurrentSlot smaller than SlotCount.");
DiagnosticsDebug.Assert(this.EditingRow != null, "Expected non-null EditingRow.");
DiagnosticsDebug.Assert(this.EditingRow.Slot == this.CurrentSlot, "Expected EditingRow.Slot equals CurrentSlot.");
DiagnosticsDebug.Assert(_editingColumnIndex != -1, "Expected _editingColumnIndex other than -1.");
// TODO: Figure out if we should cache this.IsTabStop in order to restore
// it later instead of setting it back to true unconditionally.
this.IsTabStop = false;
_focusEditingControl = false;
bool success = false;
DataGridCell dataGridCell = this.EditingRow.Cells[_editingColumnIndex];
if (setFocus)
{
if (dataGridCell.ContainsFocusedElement(this))
{
success = true;
}
else
{
success = dataGridCell.Focus(FocusState.Programmatic);
}
_focusEditingControl = !success;
}
return success;
}
/// <summary>
/// This method formats a row (specified by a DataGridRowClipboardEventArgs) into
/// a single string to be added to the Clipboard when the DataGrid is copying its contents.
/// </summary>
/// <param name="e">DataGridRowClipboardEventArgs</param>
/// <returns>The formatted string.</returns>
private string FormatClipboardContent(DataGridRowClipboardEventArgs e)
{
StringBuilder text = new StringBuilder();
for (int cellIndex = 0; cellIndex < e.ClipboardRowContent.Count; cellIndex++)
{
DataGridClipboardCellContent cellContent = e.ClipboardRowContent[cellIndex];
if (cellContent != null)
{
text.Append(cellContent.Content);
}
if (cellIndex < e.ClipboardRowContent.Count - 1)
{
text.Append('\t');
}
else
{
text.Append('\r');
text.Append('\n');
}
}
return text.ToString();
}
// Calculates the amount to scroll for the ScrollLeft button
// This is a method rather than a property to emphasize a calculation
private double GetHorizontalSmallScrollDecrease()
{
// If the first column is covered up, scroll to the start of it when the user clicks the left button
if (_negHorizontalOffset > 0)
{
return _negHorizontalOffset;
}
else
{
// The entire first column is displayed, show the entire previous column when the user clicks
// the left button
DataGridColumn previousColumn = this.ColumnsInternal.GetPreviousVisibleScrollingColumn(
this.ColumnsItemsInternal[DisplayData.FirstDisplayedScrollingCol]);
if (previousColumn != null)
{
return GetEdgedColumnWidth(previousColumn);
}
else
{
// There's no previous column so don't move
return 0;
}
}
}
// Calculates the amount to scroll for the ScrollRight button
// This is a method rather than a property to emphasize a calculation
private double GetHorizontalSmallScrollIncrease()
{
if (this.DisplayData.FirstDisplayedScrollingCol >= 0)
{
return GetEdgedColumnWidth(this.ColumnsItemsInternal[DisplayData.FirstDisplayedScrollingCol]) - _negHorizontalOffset;
}
return 0;
}
// Calculates the amount the ScrollDown button should scroll
// This is a method rather than a property to emphasize that calculations are taking place
private double GetVerticalSmallScrollIncrease()
{
if (this.DisplayData.FirstScrollingSlot >= 0)
{
return GetExactSlotElementHeight(this.DisplayData.FirstScrollingSlot) - this.NegVerticalOffset;
}
return 0;
}
private void HideScrollBars(bool useTransitions)
{
if (!_keepScrollBarsShowing)
{
_proposedScrollBarsState = ScrollBarVisualState.NoIndicator;
_proposedScrollBarsSeparatorState = UISettingsHelper.AreSettingsEnablingAnimations ? ScrollBarsSeparatorVisualState.SeparatorCollapsed : ScrollBarsSeparatorVisualState.SeparatorCollapsedWithoutAnimation;
if (UISettingsHelper.AreSettingsAutoHidingScrollBars)
{
SwitchScrollBarsVisualStates(_proposedScrollBarsState, _proposedScrollBarsSeparatorState, useTransitions);
}
}
}
private void HideScrollBarsAfterDelay()
{
if (!_keepScrollBarsShowing)
{
DispatcherQueueTimer hideScrollBarsTimer = null;
if (_hideScrollBarsTimer != null)
{
hideScrollBarsTimer = _hideScrollBarsTimer;
if (hideScrollBarsTimer.IsRunning)
{
hideScrollBarsTimer.Stop();
}
}
else
{
hideScrollBarsTimer = DispatcherQueue.GetForCurrentThread().CreateTimer();
hideScrollBarsTimer.Interval = TimeSpan.FromMilliseconds(DATAGRID_noScrollBarCountdownMs);
hideScrollBarsTimer.Tick += HideScrollBarsTimerTick;
_hideScrollBarsTimer = hideScrollBarsTimer;
}
hideScrollBarsTimer.Start();
}
}
private void HideScrollBarsTimerTick(object sender, object e)
{
StopHideScrollBarsTimer();
HideScrollBars(true /*useTransitions*/);
}
private void HookDataGridEvents()
{
this.IsEnabledChanged += new DependencyPropertyChangedEventHandler(DataGrid_IsEnabledChanged);
this.KeyDown += new KeyEventHandler(DataGrid_KeyDown);
this.KeyUp += new KeyEventHandler(DataGrid_KeyUp);
this.GettingFocus += new TypedEventHandler<UIElement, GettingFocusEventArgs>(DataGrid_GettingFocus);
this.GotFocus += new RoutedEventHandler(DataGrid_GotFocus);
this.LostFocus += new RoutedEventHandler(DataGrid_LostFocus);
this.PointerEntered += new PointerEventHandler(DataGrid_PointerEntered);
this.PointerExited += new PointerEventHandler(DataGrid_PointerExited);
this.PointerMoved += new PointerEventHandler(DataGrid_PointerMoved);
this.PointerPressed += new PointerEventHandler(DataGrid_PointerPressed);
this.PointerReleased += new PointerEventHandler(DataGrid_PointerReleased);
this.Unloaded += new RoutedEventHandler(DataGrid_Unloaded);
}
private void HookHorizontalScrollBarEvents()
{
if (_hScrollBar != null)
{
_hScrollBar.Scroll += new ScrollEventHandler(HorizontalScrollBar_Scroll);
_hScrollBar.PointerEntered += new PointerEventHandler(HorizontalScrollBar_PointerEntered);
_hScrollBar.PointerExited += new PointerEventHandler(HorizontalScrollBar_PointerExited);
}
}
private void HookVerticalScrollBarEvents()
{
if (_vScrollBar != null)
{
_vScrollBar.Scroll += new ScrollEventHandler(VerticalScrollBar_Scroll);
_vScrollBar.PointerEntered += new PointerEventHandler(VerticalScrollBar_PointerEntered);
_vScrollBar.PointerExited += new PointerEventHandler(VerticalScrollBar_PointerExited);
}
}
private void HorizontalScrollBar_PointerEntered(object sender, PointerRoutedEventArgs e)
{
_isPointerOverHorizontalScrollBar = true;
if (!UISettingsHelper.AreSettingsEnablingAnimations)
{
HideScrollBarsAfterDelay();
}
}
private void HorizontalScrollBar_PointerExited(object sender, PointerRoutedEventArgs e)
{
_isPointerOverHorizontalScrollBar = false;
HideScrollBarsAfterDelay();
}
private void VerticalScrollBar_PointerEntered(object sender, PointerRoutedEventArgs e)
{
_isPointerOverVerticalScrollBar = true;
if (!UISettingsHelper.AreSettingsEnablingAnimations)
{
HideScrollBarsAfterDelay();
}
}
private void VerticalScrollBar_PointerExited(object sender, PointerRoutedEventArgs e)
{
_isPointerOverVerticalScrollBar = false;
HideScrollBarsAfterDelay();
}
private void HorizontalScrollBar_Scroll(object sender, ScrollEventArgs e)
{
ProcessHorizontalScroll(e.ScrollEventType);
}
private void IndicatorStateStoryboard_Completed(object sender, object e)
{
// If the pointer is currently over either scroll bar, do not automatically hide the indicators.
if (!_keepScrollBarsShowing &&
!_isPointerOverVerticalScrollBar &&
!_isPointerOverHorizontalScrollBar)
{
// Go to the NoIndicator state using transitions.
if (UISettingsHelper.AreSettingsEnablingAnimations)
{
// By default there is a delay before the NoIndicator state actually shows.
HideScrollBars(true /*useTransitions*/);
}
else
{
// Since OS animations are turned off, use a timer to delay the scroll bars' hiding.
HideScrollBarsAfterDelay();
}
}
}
private bool IsColumnOutOfBounds(int columnIndex)
{
return columnIndex >= this.ColumnsItemsInternal.Count || columnIndex < 0;
}
private bool IsInnerCellOutOfBounds(int columnIndex, int slot)
{
return IsColumnOutOfBounds(columnIndex) || IsSlotOutOfBounds(slot);
}
private bool IsInnerCellOutOfSelectionBounds(int columnIndex, int slot)
{
return IsColumnOutOfBounds(columnIndex) || IsSlotOutOfSelectionBounds(slot);
}
private bool IsSlotOutOfBounds(int slot)
{
return slot >= this.SlotCount || slot < -1 || _collapsedSlotsTable.Contains(slot);
}
private bool IsSlotOutOfSelectionBounds(int slot)
{
if (this.RowGroupHeadersTable.Contains(slot))
{
DiagnosticsDebug.Assert(slot >= 0, "Expected non-negative slot.");
DiagnosticsDebug.Assert(slot < this.SlotCount, "Expected slot smaller than this.SlotCount.");
return false;
}
else
{
int rowIndex = RowIndexFromSlot(slot);
return rowIndex < 0 || rowIndex >= this.DataConnection.Count;
}
}
private void LoadMoreDataFromIncrementalItemsSource(double totalVisibleHeight)
{
if (IncrementalLoadingTrigger == IncrementalLoadingTrigger.Edge && DataConnection.IsDataSourceIncremental && DataConnection.HasMoreItems && !DataConnection.IsLoadingMoreItems)
{
var bottomScrolledOffHeight = Math.Max(0, totalVisibleHeight - CellsHeight - VerticalOffset);
if ((IncrementalLoadingThreshold * CellsHeight) >= bottomScrolledOffHeight)
{
var numberOfRowsToLoad = Math.Max(1, (int)(DataFetchSize * CellsHeight / RowHeightEstimate));
DataConnection.LoadMoreItems((uint)numberOfRowsToLoad);
}
}
}
private void MakeFirstDisplayedCellCurrentCell()
{
if (this.CurrentColumnIndex != -1)
{
_makeFirstDisplayedCellCurrentCellPending = false;
_desiredCurrentColumnIndex = -1;
this.FlushCurrentCellChanged();
return;
}
if (this.SlotCount != SlotFromRowIndex(this.DataConnection.Count))
{
_makeFirstDisplayedCellCurrentCellPending = true;
return;
}
// No current cell, therefore no selection either - try to set the current cell to the
// ItemsSource's ICollectionView.CurrentItem if it exists, otherwise use the first displayed cell.
int slot;
if (this.DataConnection.CollectionView != null)
{
if (this.DataConnection.CollectionView.IsCurrentBeforeFirst ||
this.DataConnection.CollectionView.IsCurrentAfterLast)
{
slot = this.RowGroupHeadersTable.Contains(0) ? 0 : -1;
}
else
{
slot = SlotFromRowIndex(this.DataConnection.CollectionView.CurrentPosition);
}
}
else
{
if (this.SelectedIndex == -1)
{
// Try to default to the first row
slot = SlotFromRowIndex(0);
if (!this.IsSlotVisible(slot))
{
slot = -1;
}
}
else
{
slot = SlotFromRowIndex(this.SelectedIndex);
}
}
int columnIndex = this.FirstDisplayedNonFillerColumnIndex;
if (_desiredCurrentColumnIndex >= 0 && _desiredCurrentColumnIndex < this.ColumnsItemsInternal.Count)
{
columnIndex = _desiredCurrentColumnIndex;
}
SetAndSelectCurrentCell(
columnIndex,
slot,
false /*forceCurrentCellSelection*/);
this.AnchorSlot = slot;
_makeFirstDisplayedCellCurrentCellPending = false;
_desiredCurrentColumnIndex = -1;
FlushCurrentCellChanged();
}
private void NoIndicatorStateStoryboard_Completed(object sender, object e)
{
DiagnosticsDebug.Assert(_hasNoIndicatorStateStoryboardCompletedHandler, "Expected _hasNoIndicatorStateStoryboardCompletedHandler is true.");
_showingMouseIndicators = false;
}
private void PopulateCellContent(
bool isCellEdited,
DataGridColumn dataGridColumn,
DataGridRow dataGridRow,
DataGridCell dataGridCell)
{
DiagnosticsDebug.Assert(dataGridColumn != null, "Expected non-null dataGridColumn.");
DiagnosticsDebug.Assert(dataGridRow != null, "Expected non-null dataGridRow.");
DiagnosticsDebug.Assert(dataGridCell != null, "Expected non-null dataGridCell.");
FrameworkElement element = null;
DataGridBoundColumn dataGridBoundColumn = dataGridColumn as DataGridBoundColumn;
if (isCellEdited)
{
// Generate EditingElement and apply column style if available
element = dataGridColumn.GenerateEditingElementInternal(dataGridCell, dataGridRow.DataContext);
if (element != null)
{
if (dataGridBoundColumn != null && dataGridBoundColumn.EditingElementStyle != null)
{
element.SetStyleWithType(dataGridBoundColumn.EditingElementStyle);
}
// Subscribe to the new element's events
element.Loaded += new RoutedEventHandler(EditingElement_Loaded);
}
}
else
{
// Generate Element and apply column style if available
element = dataGridColumn.GenerateElementInternal(dataGridCell, dataGridRow.DataContext);
if (element != null)
{
if (dataGridBoundColumn != null && dataGridBoundColumn.ElementStyle != null)
{
element.SetStyleWithType(dataGridBoundColumn.ElementStyle);
}
}
#if FEATURE_VALIDATION
// If we are replacing the editingElement on the cell with the displayElement, and there
// were validation errors present on the editingElement, we need to manually force the
// control to go to the InvalidUnfocused state to support Implicit Styles. The reason
// is because the editingElement is being removed as part of a keystroke, and it will
// leave the visual tree before its state is updated. Since Implicit Styles are
// disabled when an element is removed from the visual tree, the subsequent GoToState fails
// and the editingElement cannot make it to the InvalidUnfocused state. As a result,
// any popups in the InvalidFocused state would stay around incorrectly.
if (this.EditingRow != null && dataGridCell.Content != null)
{
Control control = dataGridCell.Content as Control;
if (control != null && Validation.GetHasError(control))
{
VisualStateManager.GoToState(control, VisualStates.StateInvalidUnfocused, useTransitions: false);
}
}
#endif
}
dataGridCell.Content = element;
}
private void PreparingCellForEditPrivate(FrameworkElement editingElement)
{
if (_editingColumnIndex == -1 ||
this.CurrentColumnIndex == -1 ||
this.EditingRow.Cells[this.CurrentColumnIndex].Content != editingElement)
{
// The current cell has changed since the call to BeginCellEdit, so the fact
// that this element has loaded is no longer relevant
return;
}
DiagnosticsDebug.Assert(this.EditingRow != null, "Expected non-null EditingRow.");
DiagnosticsDebug.Assert(this.EditingRow.Slot == this.CurrentSlot, "Expected EditingRow.Slot equals CurrentSlot.");
DiagnosticsDebug.Assert(_editingColumnIndex >= 0, "Expected non-negative _editingColumnIndex.");
DiagnosticsDebug.Assert(_editingColumnIndex < this.ColumnsItemsInternal.Count, "Expected _editingColumnIndex smaller than this.ColumnsItemsInternal.Count.");
DiagnosticsDebug.Assert(_editingColumnIndex == this.CurrentColumnIndex, "Expected _editingColumnIndex equals CurrentColumnIndex.");
FocusEditingCell(this.ContainsFocus || _focusEditingControl /*setFocus*/);
// Prepare the cell for editing and raise the PreparingCellForEdit event for all columns
DataGridColumn dataGridColumn = this.CurrentColumn;
_uneditedValue = dataGridColumn.PrepareCellForEditInternal(editingElement, _editingEventArgs);
OnPreparingCellForEdit(new DataGridPreparingCellForEditEventArgs(dataGridColumn, this.EditingRow, _editingEventArgs, editingElement));
}
private bool ProcessAKey()
{
bool ctrl, shift, alt;
KeyboardHelper.GetMetaKeyState(out ctrl, out shift, out alt);
if (ctrl && !shift && !alt && this.SelectionMode == DataGridSelectionMode.Extended)
{
SelectAll();
return true;
}
return false;
}
/// <summary>
/// Handles the case where a 'Copy' key ('C' or 'Insert') has been pressed. If pressed in combination with
/// the control key, and the necessary prerequisites are met, the DataGrid copies its selected rows
/// to the Clipboard as text.
/// </summary>
/// <returns>Whether or not the DataGrid handled the key press.</returns>
private bool ProcessCopyKey()
{
bool ctrl, shift, alt;
KeyboardHelper.GetMetaKeyState(out ctrl, out shift, out alt);
if (ctrl &&
!shift &&
!alt &&
this.ClipboardCopyMode != DataGridClipboardCopyMode.None &&
this.SelectedItems.Count > 0 &&
_editingColumnIndex != this.CurrentColumnIndex)
{
StringBuilder textBuilder = new StringBuilder();
if (this.ClipboardCopyMode == DataGridClipboardCopyMode.IncludeHeader)
{
DataGridRowClipboardEventArgs headerArgs = new DataGridRowClipboardEventArgs(null, true);
foreach (DataGridColumn column in this.ColumnsInternal.GetVisibleColumns())
{
headerArgs.ClipboardRowContent.Add(new DataGridClipboardCellContent(null, column, column.Header));
}
this.OnCopyingRowClipboardContent(headerArgs);
textBuilder.Append(FormatClipboardContent(headerArgs));
}
for (int index = 0; index < this.SelectedItems.Count; index++)
{
object item = this.SelectedItems[index];
DataGridRowClipboardEventArgs itemArgs = new DataGridRowClipboardEventArgs(item, false);
foreach (DataGridColumn column in this.ColumnsInternal.GetVisibleColumns())
{
object content = column.GetCellValue(item, column.ClipboardContentBinding);
itemArgs.ClipboardRowContent.Add(new DataGridClipboardCellContent(item, column, content));
}
this.OnCopyingRowClipboardContent(itemArgs);
textBuilder.Append(FormatClipboardContent(itemArgs));
}
string text = textBuilder.ToString();
if (!string.IsNullOrEmpty(text))
{
try
{
DataPackage content = new DataPackage();
content.SetText(text);
Clipboard.SetContent(content);
}
catch (SecurityException)
{
// We will get a SecurityException if the user does not allow access to the clipboard.
}
return true;
}
}
return false;
}
private bool ProcessDataGridKey(KeyRoutedEventArgs e)
{
bool focusDataGrid = false;
switch (e.Key)
{
case VirtualKey.Tab:
return ProcessTabKey(e);
case VirtualKey.Up:
focusDataGrid = ProcessUpKey();
break;
case VirtualKey.Down:
focusDataGrid = ProcessDownKey();
break;
case VirtualKey.PageDown:
focusDataGrid = ProcessNextKey();
break;
case VirtualKey.PageUp:
focusDataGrid = ProcessPriorKey();
break;
case VirtualKey.Left:
focusDataGrid = this.FlowDirection == FlowDirection.LeftToRight ? ProcessLeftKey() : ProcessRightKey();
break;
case VirtualKey.Right:
focusDataGrid = this.FlowDirection == FlowDirection.LeftToRight ? ProcessRightKey() : ProcessLeftKey();
break;
case VirtualKey.F2:
return ProcessF2Key(e);
case VirtualKey.Home:
focusDataGrid = ProcessHomeKey();
break;
case VirtualKey.End:
focusDataGrid = ProcessEndKey();
break;
case VirtualKey.Enter:
focusDataGrid = ProcessEnterKey();
break;
case VirtualKey.Escape:
return ProcessEscapeKey();
case VirtualKey.A:
return ProcessAKey();
case VirtualKey.C:
case VirtualKey.Insert:
return ProcessCopyKey();
case VirtualKey.Space:
return ProcessSpaceKey();
}
if (focusDataGrid && this.IsTabStop)
{
this.Focus(FocusState.Programmatic);
}
return focusDataGrid;
}
private bool ProcessDownKeyInternal(bool shift, bool ctrl)
{
DataGridColumn dataGridColumn = this.ColumnsInternal.FirstVisibleColumn;
int firstVisibleColumnIndex = (dataGridColumn == null) ? -1 : dataGridColumn.Index;
int lastSlot = this.LastVisibleSlot;
if (firstVisibleColumnIndex == -1 || lastSlot == -1)
{
return false;
}
if (this.WaitForLostFocus(() => { this.ProcessDownKeyInternal(shift, ctrl); }))
{
return true;
}
int nextSlot = -1;
if (this.CurrentSlot != -1)
{
nextSlot = this.GetNextVisibleSlot(this.CurrentSlot);
if (nextSlot >= this.SlotCount)
{
nextSlot = -1;
}
}
_noSelectionChangeCount++;
try
{
int desiredSlot;
int columnIndex;
DataGridSelectionAction action;
if (this.ColumnHeaderHasFocus)
{
if (ctrl || shift)
{
return false;
}
if (this.CurrentSlot == this.FirstVisibleSlot)
{
this.ColumnHeaderHasFocus = false;
return true;
}
DiagnosticsDebug.Assert(this.CurrentColumnIndex != -1, "Expected CurrentColumnIndex other than -1.");
desiredSlot = this.FirstVisibleSlot;
columnIndex = this.CurrentColumnIndex;
action = DataGridSelectionAction.SelectCurrent;
}
else if (this.CurrentColumnIndex == -1)
{
desiredSlot = this.FirstVisibleSlot;
columnIndex = firstVisibleColumnIndex;
action = DataGridSelectionAction.SelectCurrent;
}
else if (ctrl)
{
if (shift)
{
// Both Ctrl and Shift
desiredSlot = lastSlot;
columnIndex = this.CurrentColumnIndex;
action = (this.SelectionMode == DataGridSelectionMode.Extended)
? DataGridSelectionAction.SelectFromAnchorToCurrent
: DataGridSelectionAction.SelectCurrent;
}
else
{
// Ctrl without Shift
desiredSlot = lastSlot;
columnIndex = this.CurrentColumnIndex;
action = DataGridSelectionAction.SelectCurrent;
}
}
else
{
if (nextSlot == -1)
{
return true;
}
if (shift)
{
// Shift without Ctrl
desiredSlot = nextSlot;
columnIndex = this.CurrentColumnIndex;
action = DataGridSelectionAction.SelectFromAnchorToCurrent;
}
else
{
// Neither Ctrl nor Shift
desiredSlot = nextSlot;
columnIndex = this.CurrentColumnIndex;
action = DataGridSelectionAction.SelectCurrent;
}
}
UpdateSelectionAndCurrency(columnIndex, desiredSlot, action, true /*scrollIntoView*/);
}
finally
{
this.NoSelectionChangeCount--;
}
return _successfullyUpdatedSelection;
}
private bool ProcessEndKey(bool shift, bool ctrl)
{
DataGridColumn dataGridColumn = this.ColumnsInternal.LastVisibleColumn;
int lastVisibleColumnIndex = (dataGridColumn == null) ? -1 : dataGridColumn.Index;
int firstVisibleSlot = this.FirstVisibleSlot;
int lastVisibleSlot = this.LastVisibleSlot;
if (lastVisibleColumnIndex == -1 || (firstVisibleSlot == -1 && !this.ColumnHeaderHasFocus))
{
return false;
}
if (this.WaitForLostFocus(() => { this.ProcessEndKey(shift, ctrl); }))
{
return true;
}
_noSelectionChangeCount++;
try
{
if (!ctrl)
{
return ProcessRightMost(lastVisibleColumnIndex, firstVisibleSlot);
}
else if (firstVisibleSlot != -1)
{
DataGridSelectionAction action = (shift && this.SelectionMode == DataGridSelectionMode.Extended)
? DataGridSelectionAction.SelectFromAnchorToCurrent
: DataGridSelectionAction.SelectCurrent;
UpdateSelectionAndCurrency(lastVisibleColumnIndex, lastVisibleSlot, action, true /*scrollIntoView*/);
}
}
finally
{
this.NoSelectionChangeCount--;
}
return _successfullyUpdatedSelection;
}
private bool ProcessEnterKey(bool shift, bool ctrl)
{
int oldCurrentSlot = this.CurrentSlot;
if (!ctrl)
{
if (this.ColumnHeaderHasFocus)
{
this.CurrentColumn.HeaderCell.InvokeProcessSort();
return true;
}
else if (this.FirstVisibleSlot != -1 && this.RowGroupHeadersTable.Contains(this.CurrentSlot) && ToggleRowGroup())
{
return true;
}
// If the focused TextBox accepts Return, let it consume the Enter key instead of handling it here
TextBox focusedTextBox = GetFocusedElement() as TextBox;
if (focusedTextBox != null && focusedTextBox.AcceptsReturn)
{
return false;
}
if (this.WaitForLostFocus(() => { this.ProcessEnterKey(shift, ctrl); }))
{
return true;
}
// Enter behaves like the down arrow: it commits any pending edit and moves down one cell.
if (!ProcessDownKeyInternal(false, ctrl))
{
return false;
}
}
else if (this.WaitForLostFocus(() => { this.ProcessEnterKey(shift, ctrl); }))
{
return true;
}
// Try to commit the potential editing
if (oldCurrentSlot == this.CurrentSlot && EndCellEdit(DataGridEditAction.Commit, true /*exitEditingMode*/, true /*keepFocus*/, true /*raiseEvents*/) && this.EditingRow != null)
{
EndRowEdit(DataGridEditAction.Commit, true /*exitEditingMode*/, true /*raiseEvents*/);
ScrollIntoView(this.CurrentItem, this.CurrentColumn);
}
return true;
}
private bool ProcessEscapeKey()
{
if (this.WaitForLostFocus(() => { this.ProcessEscapeKey(); }))
{
return true;
}
if (_editingColumnIndex != -1)
{
// Revert the potential cell editing and exit cell editing.
EndCellEdit(DataGridEditAction.Cancel, true /*exitEditingMode*/, true /*keepFocus*/, true /*raiseEvents*/);
return true;
}
else if (this.EditingRow != null)
{
// Revert the potential row editing and exit row editing.
EndRowEdit(DataGridEditAction.Cancel, true /*exitEditingMode*/, true /*raiseEvents*/);
return true;
}
return false;
}
private bool ProcessF2Key(KeyRoutedEventArgs e)
{
bool ctrl, shift;
KeyboardHelper.GetMetaKeyState(out ctrl, out shift);
if (!shift && !ctrl &&
_editingColumnIndex == -1 && this.CurrentColumnIndex != -1 && GetRowSelection(this.CurrentSlot) &&
!GetColumnEffectiveReadOnlyState(this.CurrentColumn))
{
if (ScrollSlotIntoView(this.CurrentColumnIndex, this.CurrentSlot, false /*forCurrentCellChange*/, true /*forceHorizontalScroll*/))
{
BeginCellEdit(e);
}
return true;
}
return false;
}
private bool ProcessHomeKey(bool shift, bool ctrl)
{
DataGridColumn dataGridColumn = this.ColumnsInternal.FirstVisibleNonFillerColumn;
int firstVisibleColumnIndex = (dataGridColumn == null) ? -1 : dataGridColumn.Index;
int firstVisibleSlot = this.FirstVisibleSlot;
if (firstVisibleColumnIndex == -1 || (firstVisibleSlot == -1 && !this.ColumnHeaderHasFocus))
{
return false;
}
if (this.WaitForLostFocus(() => { this.ProcessHomeKey(shift, ctrl); }))
{
return true;
}
_noSelectionChangeCount++;
try
{
if (!ctrl)
{
return ProcessLeftMost(firstVisibleColumnIndex, firstVisibleSlot);
}
else if (firstVisibleSlot != -1)
{
DataGridSelectionAction action = (shift && this.SelectionMode == DataGridSelectionMode.Extended)
? DataGridSelectionAction.SelectFromAnchorToCurrent
: DataGridSelectionAction.SelectCurrent;
UpdateSelectionAndCurrency(firstVisibleColumnIndex, firstVisibleSlot, action, true /*scrollIntoView*/);
}
}
finally
{
this.NoSelectionChangeCount--;
}
return _successfullyUpdatedSelection;
}
private bool ProcessLeftKey(bool shift, bool ctrl)
{
DataGridColumn dataGridColumn = this.ColumnsInternal.FirstVisibleNonFillerColumn;
int firstVisibleColumnIndex = (dataGridColumn == null) ? -1 : dataGridColumn.Index;
int firstVisibleSlot = this.FirstVisibleSlot;
if (firstVisibleColumnIndex == -1 || (firstVisibleSlot == -1 && !this.ColumnHeaderHasFocus))
{
return false;
}
if (this.WaitForLostFocus(() => { this.ProcessLeftKey(shift, ctrl); }))
{
return true;
}
int previousVisibleColumnIndex = -1;
if (this.CurrentColumnIndex != -1)
{
dataGridColumn = this.ColumnsInternal.GetPreviousVisibleNonFillerColumn(this.ColumnsItemsInternal[this.CurrentColumnIndex]);
if (dataGridColumn != null)
{
previousVisibleColumnIndex = dataGridColumn.Index;
}
}
DataGridColumn oldFocusedColumn = this.FocusedColumn;
_noSelectionChangeCount++;
try
{
if (ctrl)
{
return ProcessLeftMost(firstVisibleColumnIndex, firstVisibleSlot);
}
else if (firstVisibleSlot != -1 && (!this.RowGroupHeadersTable.Contains(this.CurrentSlot) || this.ColumnHeaderHasFocus))
{
if (this.CurrentColumnIndex == -1)
{
UpdateSelectionAndCurrency(firstVisibleColumnIndex, firstVisibleSlot, DataGridSelectionAction.SelectCurrent, true /*scrollIntoView*/);
}
else
{
if (previousVisibleColumnIndex == -1)
{
return true;
}
_noFocusedColumnChangeCount++;
try
{
UpdateSelectionAndCurrency(previousVisibleColumnIndex, this.CurrentSlot, DataGridSelectionAction.None, true /*scrollIntoView*/);
}
finally
{
_noFocusedColumnChangeCount--;
}
}
}
}
finally
{
this.NoSelectionChangeCount--;
}
if (this.ColumnHeaderHasFocus)
{
if (this.CurrentColumn == null)
{
dataGridColumn = this.ColumnsInternal.GetPreviousVisibleNonFillerColumn(this.FocusedColumn);
if (dataGridColumn != null)
{
this.FocusedColumn = dataGridColumn;
}
}
else
{
this.FocusedColumn = this.CurrentColumn;
}
if (firstVisibleSlot == -1 && this.FocusedColumn != null)
{
ScrollColumnIntoView(this.FocusedColumn.Index);
}
}
bool focusedColumnChanged = this.ColumnHeaderHasFocus && oldFocusedColumn != this.FocusedColumn;
if (focusedColumnChanged)
{
if (oldFocusedColumn != null && oldFocusedColumn.HasHeaderCell)
{
oldFocusedColumn.HeaderCell.ApplyState(true);
}
if (this.FocusedColumn != null && this.FocusedColumn.HasHeaderCell)
{
this.FocusedColumn.HeaderCell.ApplyState(true);
}
}
return focusedColumnChanged || _successfullyUpdatedSelection;
}
// Ctrl Left <==> Home
private bool ProcessLeftMost(int firstVisibleColumnIndex, int firstVisibleSlot)
{
DataGridColumn oldFocusedColumn = this.FocusedColumn;
_noSelectionChangeCount++;
try
{
int desiredSlot;
DataGridSelectionAction action;
if (this.CurrentColumnIndex == -1)
{
desiredSlot = firstVisibleSlot;
action = DataGridSelectionAction.SelectCurrent;
DiagnosticsDebug.Assert(_selectedItems.Count == 0, "Expected _selectedItems.Count equals 0.");
}
else
{
desiredSlot = this.CurrentSlot;
action = DataGridSelectionAction.None;
}
_noFocusedColumnChangeCount++;
try
{
UpdateSelectionAndCurrency(firstVisibleColumnIndex, desiredSlot, action, true /*scrollIntoView*/);
}
finally
{
_noFocusedColumnChangeCount--;
}
}
finally
{
this.NoSelectionChangeCount--;
}
if (this.ColumnHeaderHasFocus)
{
if (this.CurrentColumn == null)
{
this.FocusedColumn = this.ColumnsInternal.FirstVisibleColumn;
}
else
{
this.FocusedColumn = this.CurrentColumn;
}
if (firstVisibleSlot == -1 && this.FocusedColumn != null)
{
ScrollColumnIntoView(this.FocusedColumn.Index);
}
}
bool focusedColumnChanged = this.ColumnHeaderHasFocus && oldFocusedColumn != this.FocusedColumn;
if (focusedColumnChanged)
{
if (oldFocusedColumn != null && oldFocusedColumn.HasHeaderCell)
{
oldFocusedColumn.HeaderCell.ApplyState(true);
}
if (this.FocusedColumn != null && this.FocusedColumn.HasHeaderCell)
{
this.FocusedColumn.HeaderCell.ApplyState(true);
}
}
return focusedColumnChanged || _successfullyUpdatedSelection;
}
private bool ProcessNextKey(bool shift, bool ctrl)
{
DataGridColumn dataGridColumn = this.ColumnsInternal.FirstVisibleNonFillerColumn;
int firstVisibleColumnIndex = (dataGridColumn == null) ? -1 : dataGridColumn.Index;
if (firstVisibleColumnIndex == -1 || this.DisplayData.FirstScrollingSlot == -1)
{
return false;
}
if (this.WaitForLostFocus(() => { this.ProcessNextKey(shift, ctrl); }))
{
return true;
}
int nextPageSlot = this.CurrentSlot == -1 ? this.DisplayData.FirstScrollingSlot : this.CurrentSlot;
DiagnosticsDebug.Assert(nextPageSlot != -1, "Expected nextPageSlot other than -1.");
int slot = GetNextVisibleSlot(nextPageSlot);
int scrollCount = this.DisplayData.NumTotallyDisplayedScrollingElements;
while (scrollCount > 0 && slot < this.SlotCount)
{
nextPageSlot = slot;
scrollCount--;
slot = GetNextVisibleSlot(slot);
}
_noSelectionChangeCount++;
try
{
DataGridSelectionAction action;
int columnIndex;
if (this.CurrentColumnIndex == -1)
{
columnIndex = firstVisibleColumnIndex;
action = DataGridSelectionAction.SelectCurrent;
}
else
{
columnIndex = this.CurrentColumnIndex;
action = (shift && this.SelectionMode == DataGridSelectionMode.Extended)
? DataGridSelectionAction.SelectFromAnchorToCurrent
: DataGridSelectionAction.SelectCurrent;
}
UpdateSelectionAndCurrency(columnIndex, nextPageSlot, action, true /*scrollIntoView*/);
}
finally
{
this.NoSelectionChangeCount--;
}
return _successfullyUpdatedSelection;
}
private bool ProcessPriorKey(bool shift, bool ctrl)
{
DataGridColumn dataGridColumn = this.ColumnsInternal.FirstVisibleNonFillerColumn;
int firstVisibleColumnIndex = (dataGridColumn == null) ? -1 : dataGridColumn.Index;
if (firstVisibleColumnIndex == -1 || this.DisplayData.FirstScrollingSlot == -1)
{
return false;
}
if (this.WaitForLostFocus(() => { this.ProcessPriorKey(shift, ctrl); }))
{
return true;
}
int previousPageSlot = (this.CurrentSlot == -1) ? this.DisplayData.FirstScrollingSlot : this.CurrentSlot;
DiagnosticsDebug.Assert(previousPageSlot != -1, "Expected previousPageSlot other than -1.");
int scrollCount = this.DisplayData.NumTotallyDisplayedScrollingElements;
int slot = GetPreviousVisibleSlot(previousPageSlot);
while (scrollCount > 0 && slot != -1)
{
previousPageSlot = slot;
scrollCount--;
slot = GetPreviousVisibleSlot(slot);
}
DiagnosticsDebug.Assert(previousPageSlot != -1, "Expected previousPageSlot other than -1.");
_noSelectionChangeCount++;
try
{
int columnIndex;
DataGridSelectionAction action;
if (this.CurrentColumnIndex == -1)
{
columnIndex = firstVisibleColumnIndex;
action = DataGridSelectionAction.SelectCurrent;
}
else
{
columnIndex = this.CurrentColumnIndex;
action = (shift && this.SelectionMode == DataGridSelectionMode.Extended)
? DataGridSelectionAction.SelectFromAnchorToCurrent
: DataGridSelectionAction.SelectCurrent;
}
UpdateSelectionAndCurrency(columnIndex, previousPageSlot, action, true /*scrollIntoView*/);
}
finally
{
this.NoSelectionChangeCount--;
}
return _successfullyUpdatedSelection;
}
private bool ProcessRightKey(bool shift, bool ctrl)
{
DataGridColumn dataGridColumn = this.ColumnsInternal.LastVisibleColumn;
int lastVisibleColumnIndex = (dataGridColumn == null) ? -1 : dataGridColumn.Index;
int firstVisibleSlot = this.FirstVisibleSlot;
if (lastVisibleColumnIndex == -1 || (firstVisibleSlot == -1 && !this.ColumnHeaderHasFocus))
{
return false;
}
if (this.WaitForLostFocus(() => { this.ProcessRightKey(shift, ctrl); }))
{
return true;
}
int nextVisibleColumnIndex = -1;
if (this.CurrentColumnIndex != -1)
{
dataGridColumn = this.ColumnsInternal.GetNextVisibleColumn(this.ColumnsItemsInternal[this.CurrentColumnIndex]);
if (dataGridColumn != null)
{
nextVisibleColumnIndex = dataGridColumn.Index;
}
}
DataGridColumn oldFocusedColumn = this.FocusedColumn;
_noSelectionChangeCount++;
try
{
if (ctrl)
{
return ProcessRightMost(lastVisibleColumnIndex, firstVisibleSlot);
}
else if (firstVisibleSlot != -1 && (!this.RowGroupHeadersTable.Contains(this.CurrentSlot) || this.ColumnHeaderHasFocus))
{
if (this.CurrentColumnIndex == -1)
{
int firstVisibleColumnIndex = this.ColumnsInternal.FirstVisibleColumn == null ? -1 : this.ColumnsInternal.FirstVisibleColumn.Index;
UpdateSelectionAndCurrency(firstVisibleColumnIndex, firstVisibleSlot, DataGridSelectionAction.SelectCurrent, true /*scrollIntoView*/);
}
else
{
if (nextVisibleColumnIndex == -1)
{
return true;
}
_noFocusedColumnChangeCount++;
try
{
UpdateSelectionAndCurrency(nextVisibleColumnIndex, this.CurrentSlot, DataGridSelectionAction.None, true /*scrollIntoView*/);
}
finally
{
_noFocusedColumnChangeCount--;
}
}
}
}
finally
{
this.NoSelectionChangeCount--;
}
if (this.ColumnHeaderHasFocus)
{
if (this.CurrentColumn == null)
{
dataGridColumn = this.ColumnsInternal.GetNextVisibleColumn(this.FocusedColumn);
if (dataGridColumn != null)
{
this.FocusedColumn = dataGridColumn;
}
}
else
{
this.FocusedColumn = this.CurrentColumn;
}
if (firstVisibleSlot == -1 && this.FocusedColumn != null)
{
ScrollColumnIntoView(this.FocusedColumn.Index);
}
}
bool focusedColumnChanged = this.ColumnHeaderHasFocus && oldFocusedColumn != this.FocusedColumn;
if (focusedColumnChanged)
{
if (oldFocusedColumn != null && oldFocusedColumn.HasHeaderCell)
{
oldFocusedColumn.HeaderCell.ApplyState(true);
}
if (this.FocusedColumn != null && this.FocusedColumn.HasHeaderCell)
{
this.FocusedColumn.HeaderCell.ApplyState(true);
}
}
return focusedColumnChanged || _successfullyUpdatedSelection;
}
// Ctrl Right <==> End
private bool ProcessRightMost(int lastVisibleColumnIndex, int firstVisibleSlot)
{
DataGridColumn oldFocusedColumn = this.FocusedColumn;
_noSelectionChangeCount++;
try
{
int desiredSlot;
DataGridSelectionAction action;
if (this.CurrentColumnIndex == -1)
{
desiredSlot = firstVisibleSlot;
action = DataGridSelectionAction.SelectCurrent;
}
else
{
desiredSlot = this.CurrentSlot;
action = DataGridSelectionAction.None;
}
_noFocusedColumnChangeCount++;
try
{
UpdateSelectionAndCurrency(lastVisibleColumnIndex, desiredSlot, action, true /*scrollIntoView*/);
}
finally
{
_noFocusedColumnChangeCount--;
}
}
finally
{
this.NoSelectionChangeCount--;
}
if (this.ColumnHeaderHasFocus)
{
if (this.CurrentColumn == null)
{
this.FocusedColumn = this.ColumnsInternal.LastVisibleColumn;
}
else
{
this.FocusedColumn = this.CurrentColumn;
}
if (firstVisibleSlot == -1 && this.FocusedColumn != null)
{
ScrollColumnIntoView(this.FocusedColumn.Index);
}
}
bool focusedColumnChanged = this.ColumnHeaderHasFocus && oldFocusedColumn != this.FocusedColumn;
if (focusedColumnChanged)
{
if (oldFocusedColumn != null && oldFocusedColumn.HasHeaderCell)
{
oldFocusedColumn.HeaderCell.ApplyState(true);
}
if (this.FocusedColumn != null && this.FocusedColumn.HasHeaderCell)
{
this.FocusedColumn.HeaderCell.ApplyState(true);
}
}
return focusedColumnChanged || _successfullyUpdatedSelection;
}
private bool ProcessSpaceKey()
{
return ToggleRowGroup();
}
private bool ProcessTabKey(KeyRoutedEventArgs e)
{
bool ctrl, shift;
KeyboardHelper.GetMetaKeyState(out ctrl, out shift);
return this.ProcessTabKey(e, shift, ctrl);
}
private bool ProcessTabKey(KeyRoutedEventArgs e, bool shift, bool ctrl)
{
if (ctrl || _editingColumnIndex == -1 || this.IsReadOnly)
{
// Go to the next/previous control on the page, or to the column header, when:
// - the Ctrl key is pressed, or
// - no cell is currently being edited, or the DataGrid is read-only.
if (!shift && this.ColumnHeaders != null && this.AreColumnHeadersVisible && !this.ColumnHeaderHasFocus)
{
// Show focus on the current column's header.
this.ColumnHeaderHasFocus = true;
return true;
}
else if (shift && this.ColumnHeaderHasFocus)
{
this.ColumnHeaderHasFocus = false;
return this.CurrentColumnIndex != -1;
}
this.ColumnHeaderHasFocus = false;
return false;
}
// Try to locate a writable cell before/after the current cell
DiagnosticsDebug.Assert(this.CurrentColumnIndex != -1, "Expected CurrentColumnIndex other than -1.");
DiagnosticsDebug.Assert(this.CurrentSlot != -1, "Expected CurrentSlot other than -1.");
int neighborVisibleWritableColumnIndex, neighborSlot;
DataGridColumn dataGridColumn;
if (shift)
{
dataGridColumn = this.ColumnsInternal.GetPreviousVisibleWritableColumn(this.ColumnsItemsInternal[this.CurrentColumnIndex]);
neighborSlot = GetPreviousVisibleSlot(this.CurrentSlot);
if (this.EditingRow != null)
{
while (neighborSlot != -1 && this.RowGroupHeadersTable.Contains(neighborSlot))
{
neighborSlot = GetPreviousVisibleSlot(neighborSlot);
}
}
}
else
{
dataGridColumn = this.ColumnsInternal.GetNextVisibleWritableColumn(this.ColumnsItemsInternal[this.CurrentColumnIndex]);
neighborSlot = GetNextVisibleSlot(this.CurrentSlot);
if (this.EditingRow != null)
{
while (neighborSlot < this.SlotCount && this.RowGroupHeadersTable.Contains(neighborSlot))
{
neighborSlot = GetNextVisibleSlot(neighborSlot);
}
}
}
neighborVisibleWritableColumnIndex = (dataGridColumn == null) ? -1 : dataGridColumn.Index;
if (neighborVisibleWritableColumnIndex == -1 && (neighborSlot == -1 || neighborSlot >= this.SlotCount))
{
// There is no previous/next row and no previous/next writable cell on the current row
return false;
}
if (this.WaitForLostFocus(() => { this.ProcessTabKey(e, shift, ctrl); }))
{
return true;
}
int targetSlot = -1, targetColumnIndex = -1;
_noSelectionChangeCount++;
try
{
if (neighborVisibleWritableColumnIndex == -1)
{
targetSlot = neighborSlot;
if (shift)
{
DiagnosticsDebug.Assert(this.ColumnsInternal.LastVisibleWritableColumn != null, "Expected non-null ColumnsInternal.LastVisibleWritableColumn.");
targetColumnIndex = this.ColumnsInternal.LastVisibleWritableColumn.Index;
}
else
{
DiagnosticsDebug.Assert(this.ColumnsInternal.FirstVisibleWritableColumn != null, "Expected non-null ColumnsInternal.FirstVisibleWritableColumn.");
targetColumnIndex = this.ColumnsInternal.FirstVisibleWritableColumn.Index;
}
}
else
{
targetSlot = this.CurrentSlot;
targetColumnIndex = neighborVisibleWritableColumnIndex;
}
DataGridSelectionAction action;
if (targetSlot != this.CurrentSlot || (this.SelectionMode == DataGridSelectionMode.Extended))
{
if (IsSlotOutOfBounds(targetSlot))
{
return true;
}
action = DataGridSelectionAction.SelectCurrent;
}
else
{
action = DataGridSelectionAction.None;
}
UpdateSelectionAndCurrency(targetColumnIndex, targetSlot, action, true /*scrollIntoView*/);
}
finally
{
this.NoSelectionChangeCount--;
}
if (_successfullyUpdatedSelection && !this.RowGroupHeadersTable.Contains(targetSlot))
{
BeginCellEdit(e);
}
// Return true to say we handled the key event even if the operation was unsuccessful. If we don't
// say we handled this event, the framework will continue to process the tab key and change focus.
return true;
}
private bool ProcessUpKey(bool shift, bool ctrl)
{
DataGridColumn dataGridColumn = this.ColumnsInternal.FirstVisibleNonFillerColumn;
int firstVisibleColumnIndex = (dataGridColumn == null) ? -1 : dataGridColumn.Index;
int firstVisibleSlot = this.FirstVisibleSlot;
if (firstVisibleColumnIndex == -1 || firstVisibleSlot == -1)
{
return false;
}
if (this.WaitForLostFocus(() => { this.ProcessUpKey(shift, ctrl); }))
{
return true;
}
int previousVisibleSlot = (this.CurrentSlot != -1) ? GetPreviousVisibleSlot(this.CurrentSlot) : -1;
_noSelectionChangeCount++;
try
{
int slot;
int columnIndex;
DataGridSelectionAction action;
if (this.CurrentColumnIndex == -1)
{
slot = firstVisibleSlot;
columnIndex = firstVisibleColumnIndex;
action = DataGridSelectionAction.SelectCurrent;
}
else if (ctrl)
{
if (shift)
{
// Both Ctrl and Shift
slot = firstVisibleSlot;
columnIndex = this.CurrentColumnIndex;
action = (this.SelectionMode == DataGridSelectionMode.Extended)
? DataGridSelectionAction.SelectFromAnchorToCurrent
: DataGridSelectionAction.SelectCurrent;
}
else
{
// Ctrl without Shift
slot = firstVisibleSlot;
columnIndex = this.CurrentColumnIndex;
action = DataGridSelectionAction.SelectCurrent;
}
}
else
{
if (previousVisibleSlot == -1)
{
return true;
}
if (shift)
{
// Shift without Ctrl
slot = previousVisibleSlot;
columnIndex = this.CurrentColumnIndex;
action = DataGridSelectionAction.SelectFromAnchorToCurrent;
}
else
{
// Neither Shift nor Ctrl
slot = previousVisibleSlot;
columnIndex = this.CurrentColumnIndex;
action = DataGridSelectionAction.SelectCurrent;
}
}
UpdateSelectionAndCurrency(columnIndex, slot, action, true /*scrollIntoView*/);
}
finally
{
this.NoSelectionChangeCount--;
}
return _successfullyUpdatedSelection;
}
private void RemoveDisplayedColumnHeader(DataGridColumn dataGridColumn)
{
if (_columnHeadersPresenter != null)
{
_columnHeadersPresenter.Children.Remove(dataGridColumn.HeaderCell);
}
}
private void RemoveDisplayedColumnHeaders()
{
if (_columnHeadersPresenter != null)
{
_columnHeadersPresenter.Children.Clear();
}
this.ColumnsInternal.FillerColumn.IsRepresented = false;
}
private bool ResetCurrentCellCore()
{
return this.CurrentColumnIndex == -1 || SetCurrentCellCore(-1, -1);
}
private void ResetEditingRow()
{
DataGridRow oldEditingRow = this.EditingRow;
if (oldEditingRow != null &&
oldEditingRow != _focusedRow &&
!IsSlotVisible(oldEditingRow.Slot))
{
// Unload the old editing row if it's off screen
oldEditingRow.Clip = null;
UnloadRow(oldEditingRow);
this.DisplayData.FullyRecycleElements();
}
this.EditingRow = null;
if (oldEditingRow != null && IsSlotVisible(oldEditingRow.Slot))
{
// If the row is no longer editing, then its visuals need to change.
oldEditingRow.ApplyState(true /*animate*/);
}
}
private void ResetFocusedRow()
{
if (_focusedRow != null &&
_focusedRow != this.EditingRow &&
!IsSlotVisible(_focusedRow.Slot))
{
// Unload the old focused row if it's off screen
_focusedRow.Clip = null;
UnloadRow(_focusedRow);
this.DisplayData.FullyRecycleElements();
}
_focusedRow = null;
}
private void ResetValidationStatus()
{
// Clear the invalid status of the Cell, Row and DataGrid
if (this.EditingRow != null)
{
this.EditingRow.IsValid = true;
if (this.EditingRow.Index != -1)
{
foreach (DataGridCell cell in this.EditingRow.Cells)
{
if (!cell.IsValid)
{
cell.IsValid = true;
cell.ApplyCellState(true);
}
}
this.EditingRow.ApplyState(true);
}
}
this.IsValid = true;
// Clear the previous validation results
_validationResults.Clear();
#if FEATURE_VALIDATION_SUMMARY
// Hide the error list if validation succeeded
if (_validationSummary != null && _validationSummary.Errors.Count > 0)
{
_validationSummary.Errors.Clear();
if (this.EditingRow != null)
{
int editingRowSlot = this.EditingRow.Slot;
InvalidateMeasure();
// TODO: Move to DispatcherQueue when FEATURE_VALIDATION_SUMMARY is enabled
this.Dispatcher.BeginInvoke(() =>
{
// It's possible that the DataContext or ItemsSource has changed by the time we reach this code,
// so we need to ensure that the editing row still exists before scrolling it into view
if (!IsSlotOutOfBounds(editingRowSlot) && editingRowSlot != -1)
{
ScrollSlotIntoView(editingRowSlot, false /*scrolledHorizontally*/);
}
});
}
}
#endif
}
private void RowGroupHeaderStyles_CollectionChanged(object sender, NotifyCollectionChangedEventArgs e)
{
if (_rowsPresenter != null)
{
Style oldLastStyle = _rowGroupHeaderStylesOld.Count > 0 ? _rowGroupHeaderStylesOld[_rowGroupHeaderStylesOld.Count - 1] : null;
while (_rowGroupHeaderStylesOld.Count < _rowGroupHeaderStyles.Count)
{
_rowGroupHeaderStylesOld.Add(oldLastStyle);
}
Style lastStyle = _rowGroupHeaderStyles.Count > 0 ? _rowGroupHeaderStyles[_rowGroupHeaderStyles.Count - 1] : null;
foreach (UIElement element in _rowsPresenter.Children)
{
DataGridRowGroupHeader groupHeader = element as DataGridRowGroupHeader;
if (groupHeader != null)
{
Style oldStyle = groupHeader.Level < _rowGroupHeaderStylesOld.Count ? _rowGroupHeaderStylesOld[groupHeader.Level] : oldLastStyle;
Style newStyle = groupHeader.Level < _rowGroupHeaderStyles.Count ? _rowGroupHeaderStyles[groupHeader.Level] : lastStyle;
EnsureElementStyle(groupHeader, oldStyle, newStyle);
}
}
}
_rowGroupHeaderStylesOld.Clear();
foreach (Style style in _rowGroupHeaderStyles)
{
_rowGroupHeaderStylesOld.Add(style);
}
}
private void SelectAll()
{
SetRowsSelection(0, this.SlotCount - 1);
}
private void SetAndSelectCurrentCell(
int columnIndex,
int slot,
bool forceCurrentCellSelection)
{
DataGridSelectionAction action = forceCurrentCellSelection ? DataGridSelectionAction.SelectCurrent : DataGridSelectionAction.None;
UpdateSelectionAndCurrency(columnIndex, slot, action, false /*scrollIntoView*/);
}
// columnIndex = 2, slot = -1 --> current cell belongs to the 'new row'.
// columnIndex = 2, slot = 2 --> current cell is an inner cell.
// columnIndex = -1, slot = -1 --> current cell is reset.
// columnIndex = -1, slot = 2 --> unexpected.
private bool SetCurrentCellCore(int columnIndex, int slot, bool commitEdit, bool endRowEdit)
{
DiagnosticsDebug.Assert(columnIndex < this.ColumnsItemsInternal.Count, "Expected columnIndex smaller than ColumnsItemsInternal.Count.");
DiagnosticsDebug.Assert(slot < this.SlotCount, "Expected slot smaller than this.SlotCount.");
DiagnosticsDebug.Assert(columnIndex == -1 || this.ColumnsItemsInternal[columnIndex].IsVisible, "Expected columnIndex equals -1 or ColumnsItemsInternal[columnIndex].IsVisible is true.");
DiagnosticsDebug.Assert(columnIndex <= -1 || slot != -1, "Expected columnIndex smaller than or equal to -1 or slot other than -1.");
if (columnIndex == this.CurrentColumnIndex &&
slot == this.CurrentSlot)
{
DiagnosticsDebug.Assert(this.DataConnection != null, "Expected non-null DataConnection.");
DiagnosticsDebug.Assert(_editingColumnIndex == -1 || _editingColumnIndex == this.CurrentColumnIndex, "Expected _editingColumnIndex equals -1 or _editingColumnIndex equals CurrentColumnIndex.");
DiagnosticsDebug.Assert(this.EditingRow == null || this.EditingRow.Slot == this.CurrentSlot || this.DataConnection.EndingEdit, "Expected EditingRow is null or EditingRow.Slot equals CurrentSlot or DataConnection.EndingEdit is true.");
return true;
}
UIElement oldDisplayedElement = null;
DataGridCellCoordinates oldCurrentCell = new DataGridCellCoordinates(this.CurrentCellCoordinates);
object newCurrentItem = null;
if (!this.RowGroupHeadersTable.Contains(slot))
{
int rowIndex = this.RowIndexFromSlot(slot);
if (rowIndex >= 0 && rowIndex < this.DataConnection.Count)
{
newCurrentItem = this.DataConnection.GetDataItem(rowIndex);
}
}
if (this.CurrentColumnIndex > -1)
{
DiagnosticsDebug.Assert(this.CurrentColumnIndex < this.ColumnsItemsInternal.Count, "Expected CurrentColumnIndex smaller than ColumnsItemsInternal.Count.");
DiagnosticsDebug.Assert(this.CurrentSlot < this.SlotCount, "Expected CurrentSlot smaller than SlotCount.");
if (!IsInnerCellOutOfBounds(oldCurrentCell.ColumnIndex, oldCurrentCell.Slot) &&
this.IsSlotVisible(oldCurrentCell.Slot))
{
oldDisplayedElement = this.DisplayData.GetDisplayedElement(oldCurrentCell.Slot);
}
if (!this.RowGroupHeadersTable.Contains(oldCurrentCell.Slot) && !_temporarilyResetCurrentCell)
{
bool keepFocus = this.ContainsFocus;
if (commitEdit)
{
if (!EndCellEdit(DataGridEditAction.Commit, true /*exitEditingMode*/, keepFocus, true /*raiseEvents*/))
{
return false;
}
// Resetting the current cell: setting it to (-1, -1) is not considered setting it out of bounds
if ((columnIndex != -1 && slot != -1 && IsInnerCellOutOfSelectionBounds(columnIndex, slot)) ||
IsInnerCellOutOfSelectionBounds(oldCurrentCell.ColumnIndex, oldCurrentCell.Slot))
{
return false;
}
if (endRowEdit && !EndRowEdit(DataGridEditAction.Commit, true /*exitEditingMode*/, true /*raiseEvents*/))
{
return false;
}
}
else
{
this.CancelEdit(DataGridEditingUnit.Row, false);
ExitEdit(keepFocus);
}
}
}
if (newCurrentItem != null)
{
slot = this.SlotFromRowIndex(this.DataConnection.IndexOf(newCurrentItem));
}
if (slot == -1 && columnIndex != -1)
{
return false;
}
if (_noFocusedColumnChangeCount == 0)
{
this.ColumnHeaderHasFocus = false;
}
this.CurrentColumnIndex = columnIndex;
this.CurrentSlot = slot;
if (_temporarilyResetCurrentCell)
{
if (columnIndex != -1)
{
_temporarilyResetCurrentCell = false;
}
}
if (!_temporarilyResetCurrentCell && _editingColumnIndex != -1)
{
_editingColumnIndex = columnIndex;
}
if (oldDisplayedElement != null)
{
DataGridRow row = oldDisplayedElement as DataGridRow;
if (row != null)
{
// Don't reset the state of the current cell if we're editing it because that would put it in an invalid state
UpdateCurrentState(oldDisplayedElement, oldCurrentCell.ColumnIndex, !(_temporarilyResetCurrentCell && row.IsEditing && _editingColumnIndex == oldCurrentCell.ColumnIndex));
}
else
{
UpdateCurrentState(oldDisplayedElement, oldCurrentCell.ColumnIndex, false /*applyCellState*/);
}
}
if (this.CurrentColumnIndex > -1)
{
DiagnosticsDebug.Assert(this.CurrentSlot > -1, "Expected CurrentSlot greater than -1.");
DiagnosticsDebug.Assert(this.CurrentColumnIndex < this.ColumnsItemsInternal.Count, "Expected CurrentColumnIndex smaller than ColumnsItemsInternal.Count.");
DiagnosticsDebug.Assert(this.CurrentSlot < this.SlotCount, "Expected CurrentSlot smaller than SlotCount.");
if (this.IsSlotVisible(this.CurrentSlot))
{
UpdateCurrentState(this.DisplayData.GetDisplayedElement(this.CurrentSlot), this.CurrentColumnIndex, true /*applyCellState*/);
}
}
return true;
}
private void SetHorizontalOffset(double newHorizontalOffset)
{
if (_hScrollBar != null && _hScrollBar.Value != newHorizontalOffset)
{
_hScrollBar.Value = newHorizontalOffset;
// Unless the control is still loading, show the scroll bars when an offset changes. Keep the existing indicator type.
if (VisualTreeHelper.GetParent(this) != null)
{
ShowScrollBars();
}
}
}
private void SetVerticalOffset(double newVerticalOffset)
{
VerticalOffset = newVerticalOffset;
if (_vScrollBar != null && !DoubleUtil.AreClose(newVerticalOffset, _vScrollBar.Value))
{
_vScrollBar.Value = _verticalOffset;
// Unless the control is still loading, show the scroll bars when an offset changes. Keep the existing indicator type.
if (VisualTreeHelper.GetParent(this) != null)
{
ShowScrollBars();
}
}
}
#if FEATURE_VALIDATION_SUMMARY
/// <summary>
/// Determines whether or not a specific validation result should be displayed in the ValidationSummary.
/// </summary>
/// <param name="validationResult">Validation result to display.</param>
/// <returns>True if it should be added to the ValidationSummary, false otherwise.</returns>
private bool ShouldDisplayValidationResult(ValidationResult validationResult)
{
if (this.EditingRow != null)
{
return !_bindingValidationResults.ContainsEqualValidationResult(validationResult) ||
this.EditingRow.DataContext is IDataErrorInfo || this.EditingRow.DataContext is INotifyDataErrorInfo;
}
return false;
}
#endif
private void ShowScrollBars()
{
if (this.AreAllScrollBarsCollapsed)
{
_proposedScrollBarsState = ScrollBarVisualState.NoIndicator;
_proposedScrollBarsSeparatorState = ScrollBarsSeparatorVisualState.SeparatorCollapsedWithoutAnimation;
SwitchScrollBarsVisualStates(_proposedScrollBarsState, _proposedScrollBarsSeparatorState, false /*useTransitions*/);
}
else
{
if (_hideScrollBarsTimer != null && _hideScrollBarsTimer.IsRunning)
{
_hideScrollBarsTimer.Stop();
_hideScrollBarsTimer.Start();
}
// Mouse indicators dominate if they are already showing or if we have set the flag to prefer them.
if (_preferMouseIndicators || _showingMouseIndicators)
{
if (this.AreBothScrollBarsVisible && (_isPointerOverHorizontalScrollBar || _isPointerOverVerticalScrollBar))
{
_proposedScrollBarsState = ScrollBarVisualState.MouseIndicatorFull;
}
else
{
_proposedScrollBarsState = ScrollBarVisualState.MouseIndicator;
}
_showingMouseIndicators = true;
}
else
{
_proposedScrollBarsState = ScrollBarVisualState.TouchIndicator;
}
// Select the proper state for the scroll bars separator square within the GroupScrollBarsSeparator group:
if (UISettingsHelper.AreSettingsEnablingAnimations)
{
// When OS animations are turned on, use an animation to show the square whenever a scroll bar is shown, unless the DataGrid is disabled.
_proposedScrollBarsSeparatorState =
this.IsEnabled &&
_proposedScrollBarsState == ScrollBarVisualState.MouseIndicatorFull ?
ScrollBarsSeparatorVisualState.SeparatorExpanded : ScrollBarsSeparatorVisualState.SeparatorCollapsed;
}
else
{
// OS animations are turned off. Show or hide the square depending on the presence of scroll bars, without an animation.
// When the DataGrid is disabled, hide the square in sync with the scroll bar(s).
if (_proposedScrollBarsState == ScrollBarVisualState.MouseIndicatorFull)
{
_proposedScrollBarsSeparatorState = this.IsEnabled ? ScrollBarsSeparatorVisualState.SeparatorExpandedWithoutAnimation : ScrollBarsSeparatorVisualState.SeparatorCollapsed;
}
else
{
_proposedScrollBarsSeparatorState = this.IsEnabled ? ScrollBarsSeparatorVisualState.SeparatorCollapsedWithoutAnimation : ScrollBarsSeparatorVisualState.SeparatorCollapsed;
}
}
if (!UISettingsHelper.AreSettingsAutoHidingScrollBars)
{
if (this.AreBothScrollBarsVisible)
{
if (UISettingsHelper.AreSettingsEnablingAnimations)
{
SwitchScrollBarsVisualStates(ScrollBarVisualState.MouseIndicatorFull, this.IsEnabled ? ScrollBarsSeparatorVisualState.SeparatorExpanded : ScrollBarsSeparatorVisualState.SeparatorCollapsed, true /*useTransitions*/);
}
else
{
SwitchScrollBarsVisualStates(ScrollBarVisualState.MouseIndicatorFull, this.IsEnabled ? ScrollBarsSeparatorVisualState.SeparatorExpandedWithoutAnimation : ScrollBarsSeparatorVisualState.SeparatorCollapsed, true /*useTransitions*/);
}
}
else
{
if (UISettingsHelper.AreSettingsEnablingAnimations)
{
SwitchScrollBarsVisualStates(ScrollBarVisualState.MouseIndicator, ScrollBarsSeparatorVisualState.SeparatorCollapsed, true /*useTransitions*/);
}
else
{
SwitchScrollBarsVisualStates(ScrollBarVisualState.MouseIndicator, this.IsEnabled ? ScrollBarsSeparatorVisualState.SeparatorCollapsedWithoutAnimation : ScrollBarsSeparatorVisualState.SeparatorCollapsed, true /*useTransitions*/);
}
}
}
else
{
SwitchScrollBarsVisualStates(_proposedScrollBarsState, _proposedScrollBarsSeparatorState, true /*useTransitions*/);
}
}
}
private void StopHideScrollBarsTimer()
{
if (_hideScrollBarsTimer != null && _hideScrollBarsTimer.IsRunning)
{
_hideScrollBarsTimer.Stop();
}
}
private void SwitchScrollBarsVisualStates(ScrollBarVisualState scrollBarsState, ScrollBarsSeparatorVisualState separatorState, bool useTransitions)
{
switch (scrollBarsState)
{
case ScrollBarVisualState.NoIndicator:
VisualStates.GoToState(this, useTransitions, VisualStates.StateNoIndicator);
if (!_hasNoIndicatorStateStoryboardCompletedHandler)
{
_showingMouseIndicators = false;
}
break;
case ScrollBarVisualState.TouchIndicator:
VisualStates.GoToState(this, useTransitions, VisualStates.StateTouchIndicator);
break;
case ScrollBarVisualState.MouseIndicator:
VisualStates.GoToState(this, useTransitions, VisualStates.StateMouseIndicator);
break;
case ScrollBarVisualState.MouseIndicatorFull:
VisualStates.GoToState(this, useTransitions, VisualStates.StateMouseIndicatorFull);
break;
}
switch (separatorState)
{
case ScrollBarsSeparatorVisualState.SeparatorCollapsed:
VisualStates.GoToState(this, useTransitions, VisualStates.StateSeparatorCollapsed);
break;
case ScrollBarsSeparatorVisualState.SeparatorExpanded:
VisualStates.GoToState(this, useTransitions, VisualStates.StateSeparatorExpanded);
break;
case ScrollBarsSeparatorVisualState.SeparatorExpandedWithoutAnimation:
VisualStates.GoToState(this, useTransitions, VisualStates.StateSeparatorExpandedWithoutAnimation);
break;
case ScrollBarsSeparatorVisualState.SeparatorCollapsedWithoutAnimation:
VisualStates.GoToState(this, useTransitions, VisualStates.StateSeparatorCollapsedWithoutAnimation);
break;
}
}
private void UnhookHorizontalScrollBarEvents()
{
if (_hScrollBar != null)
{
_hScrollBar.Scroll -= new ScrollEventHandler(HorizontalScrollBar_Scroll);
_hScrollBar.PointerEntered -= new PointerEventHandler(HorizontalScrollBar_PointerEntered);
_hScrollBar.PointerExited -= new PointerEventHandler(HorizontalScrollBar_PointerExited);
}
}
private void UnhookVerticalScrollBarEvents()
{
if (_vScrollBar != null)
{
_vScrollBar.Scroll -= new ScrollEventHandler(VerticalScrollBar_Scroll);
_vScrollBar.PointerEntered -= new PointerEventHandler(VerticalScrollBar_PointerEntered);
_vScrollBar.PointerExited -= new PointerEventHandler(VerticalScrollBar_PointerExited);
}
}
private void UpdateCurrentState(UIElement displayedElement, int columnIndex, bool applyCellState)
{
DataGridRow row = displayedElement as DataGridRow;
if (row != null)
{
if (this.AreRowHeadersVisible)
{
row.ApplyHeaderState(true /*animate*/);
}
DataGridCell cell = row.Cells[columnIndex];
if (applyCellState)
{
cell.ApplyCellState(true /*animate*/);
}
}
else
{
DataGridRowGroupHeader groupHeader = displayedElement as DataGridRowGroupHeader;
if (groupHeader != null)
{
groupHeader.ApplyState(true /*useTransitions*/);
if (this.AreRowHeadersVisible)
{
groupHeader.ApplyHeaderState(true /*animate*/);
}
}
}
}
private void UpdateDisabledVisual()
{
if (this.IsEnabled)
{
VisualStates.GoToState(this, true, VisualStates.StateNormal);
}
else
{
VisualStates.GoToState(this, true, VisualStates.StateDisabled, VisualStates.StateNormal);
}
}
private void UpdateHorizontalScrollBar(bool needHorizScrollBar, bool forceHorizScrollBar, double totalVisibleWidth, double totalVisibleFrozenWidth, double cellsWidth)
{
if (_hScrollBar != null)
{
if (needHorizScrollBar || forceHorizScrollBar)
{
// ..........viewportSize
// v---v
// |<|_____|###|>|
// ^ ^
// min max
// we want to make the relative size of the thumb reflect the relative size of the viewing area
// viewportSize / (max + viewportSize) = cellsWidth / max
// -> viewportSize = max * cellsWidth / (max - cellsWidth)
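// Worked numeric example (hypothetical values) of the assignments below:
// with totalVisibleWidth = 1000, cellsWidth = 400 and
// totalVisibleFrozenWidth = 100,
//   Maximum      = 1000 - 400 = 600  (maximum travel distance)
//   ViewportSize = 400 - 100  = 300  (scrollable viewing area)
// so the thumb fills ViewportSize / (Maximum + ViewportSize) = 300 / 900,
// i.e. one third of the track.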
// always zero
_hScrollBar.Minimum = 0;
if (needHorizScrollBar)
{
// maximum travel distance -- not the total width
_hScrollBar.Maximum = totalVisibleWidth - cellsWidth;
DiagnosticsDebug.Assert(totalVisibleFrozenWidth >= 0, "Expected positive totalVisibleFrozenWidth.");
if (_frozenColumnScrollBarSpacer != null)
{
_frozenColumnScrollBarSpacer.Width = totalVisibleFrozenWidth;
}
DiagnosticsDebug.Assert(_hScrollBar.Maximum >= 0, "Expected positive _hScrollBar.Maximum.");
// width of the scrollable viewing area
double viewPortSize = Math.Max(0, cellsWidth - totalVisibleFrozenWidth);
_hScrollBar.ViewportSize = viewPortSize;
_hScrollBar.LargeChange = viewPortSize;
// The ScrollBar should be in sync with HorizontalOffset at this point. There's a resize case
// where the ScrollBar will coerce an old value here, but we don't want that.
SetHorizontalOffset(_horizontalOffset);
_hScrollBar.IsEnabled = true;
}
else
{
_hScrollBar.Maximum = 0;
_hScrollBar.ViewportSize = 0;
_hScrollBar.IsEnabled = false;
}
if (_hScrollBar.Visibility != Visibility.Visible)
{
// This will trigger a call to this method via Cells_SizeChanged for which no processing is needed.
_hScrollBar.Visibility = Visibility.Visible;
_ignoreNextScrollBarsLayout = true;
if (!this.IsHorizontalScrollBarOverCells && _hScrollBar.DesiredSize.Height == 0)
{
// We need to know the height for the rest of layout to work correctly so measure it now
_hScrollBar.Measure(new Size(double.PositiveInfinity, double.PositiveInfinity));
}
}
}
else
{
_hScrollBar.Maximum = 0;
if (_hScrollBar.Visibility != Visibility.Collapsed)
{
// This will trigger a call to this method via Cells_SizeChanged for which no processing is needed.
_hScrollBar.Visibility = Visibility.Collapsed;
_ignoreNextScrollBarsLayout = true;
}
}
DataGridAutomationPeer peer = DataGridAutomationPeer.FromElement(this) as DataGridAutomationPeer;
if (peer != null)
{
peer.RaiseAutomationScrollEvents();
}
}
}
#if FEATURE_IEDITABLECOLLECTIONVIEW
private void UpdateNewItemPlaceholder()
{
int placeholderSlot = SlotFromRowIndex(this.DataConnection.NewItemPlaceholderIndex);
if (this.DataConnection.NewItemPlaceholderPosition == NewItemPlaceholderPosition.AtEnd &&
_collapsedSlotsTable.Contains(placeholderSlot) != this.IsReadOnly)
{
if (this.IsReadOnly)
{
if (this.SelectedIndex == this.DataConnection.NewItemPlaceholderIndex)
{
this.SelectedIndex = Math.Max(-1, this.DataConnection.Count - 2);
}
if (this.IsSlotVisible(SlotFromRowIndex(this.DataConnection.NewItemPlaceholderIndex)))
{
this.RemoveDisplayedElement(placeholderSlot, false, true);
this.InvalidateRowsArrange();
}
_collapsedSlotsTable.AddValue(placeholderSlot, Visibility.Collapsed);
}
else
{
_collapsedSlotsTable.RemoveValue(placeholderSlot);
}
this.VisibleSlotCount = this.SlotCount - _collapsedSlotsTable.GetIndexCount(0, this.SlotCount - 1);
this.ComputeScrollBarsLayout();
}
}
#endif
private void UpdateRowDetailsVisibilityMode(DataGridRowDetailsVisibilityMode newDetailsMode)
{
if (_rowsPresenter != null && this.DataConnection.Count > 0)
{
Visibility newDetailsVisibility = Visibility.Collapsed;
switch (newDetailsMode)
{
case DataGridRowDetailsVisibilityMode.Visible:
newDetailsVisibility = Visibility.Visible;
break;
case DataGridRowDetailsVisibilityMode.Collapsed:
newDetailsVisibility = Visibility.Collapsed;
break;
case DataGridRowDetailsVisibilityMode.VisibleWhenSelected:
break;
}
this.ClearShowDetailsTable();
bool updated = false;
foreach (DataGridRow row in this.GetAllRows())
{
if (row.Visibility == Visibility.Visible)
{
if (newDetailsMode == DataGridRowDetailsVisibilityMode.VisibleWhenSelected)
{
// For VisibleWhenSelected, we need to calculate the value for each individual row
newDetailsVisibility = _selectedItems.ContainsSlot(row.Slot) && row.Index != this.DataConnection.NewItemPlaceholderIndex ? Visibility.Visible : Visibility.Collapsed;
}
if (row.DetailsVisibility != newDetailsVisibility)
{
updated = true;
row.SetDetailsVisibilityInternal(
newDetailsVisibility,
true /*raiseNotification*/);
}
}
}
if (updated)
{
UpdateDisplayedRows(this.DisplayData.FirstScrollingSlot, this.CellsHeight);
InvalidateRowsMeasure(false /*invalidateIndividualElements*/);
}
}
}
private void UpdateRowsPresenterManipulationMode(bool horizontalMode, bool verticalMode)
{
if (_rowsPresenter != null)
{
ManipulationModes manipulationMode = _rowsPresenter.ManipulationMode;
if (horizontalMode)
{
if (this.HorizontalScrollBarVisibility != ScrollBarVisibility.Disabled)
{
manipulationMode |= ManipulationModes.TranslateX | ManipulationModes.TranslateInertia;
}
else
{
manipulationMode &= ~(ManipulationModes.TranslateX | ManipulationModes.TranslateRailsX);
}
}
if (verticalMode)
{
if (this.VerticalScrollBarVisibility != ScrollBarVisibility.Disabled)
{
manipulationMode |= ManipulationModes.TranslateY | ManipulationModes.TranslateInertia;
}
else
{
manipulationMode &= ~(ManipulationModes.TranslateY | ManipulationModes.TranslateRailsY);
}
}
if ((manipulationMode & (ManipulationModes.TranslateX | ManipulationModes.TranslateY)) == (ManipulationModes.TranslateX | ManipulationModes.TranslateY))
{
manipulationMode |= ManipulationModes.TranslateRailsX | ManipulationModes.TranslateRailsY;
}
if ((manipulationMode & (ManipulationModes.TranslateX | ManipulationModes.TranslateRailsX | ManipulationModes.TranslateY | ManipulationModes.TranslateRailsY)) ==
ManipulationModes.None)
{
manipulationMode &= ~ManipulationModes.TranslateInertia;
}
_rowsPresenter.ManipulationMode = manipulationMode;
}
}
private bool UpdateStateOnTapped(TappedRoutedEventArgs args, int columnIndex, int slot, bool allowEdit, bool shift, bool ctrl)
{
bool beginEdit;
DiagnosticsDebug.Assert(slot >= 0, "Expected positive slot.");
// Before changing selection, check if the current cell needs to be committed, and
// check if the current row needs to be committed. If any of those two operations are required and fail,
// do not change selection, and do not change current cell.
bool wasInEdit = this.EditingColumnIndex != -1;
if (IsSlotOutOfBounds(slot))
{
return true;
}
if (wasInEdit && (columnIndex != this.EditingColumnIndex || slot != this.CurrentSlot) &&
this.WaitForLostFocus(() => { this.UpdateStateOnTapped(args, columnIndex, slot, allowEdit, shift, ctrl); }))
{
return true;
}
try
{
_noSelectionChangeCount++;
beginEdit = allowEdit &&
this.CurrentSlot == slot &&
columnIndex != -1 &&
(wasInEdit || this.CurrentColumnIndex == columnIndex) &&
!GetColumnEffectiveReadOnlyState(this.ColumnsItemsInternal[columnIndex]);
DataGridSelectionAction action;
if (this.SelectionMode == DataGridSelectionMode.Extended && shift)
{
// Shift select multiple rows.
action = DataGridSelectionAction.SelectFromAnchorToCurrent;
}
else if (GetRowSelection(slot))
{
// Unselecting single row or Selecting a previously multi-selected row.
if (!ctrl && this.SelectionMode == DataGridSelectionMode.Extended && _selectedItems.Count != 0)
{
// Unselect everything except the row that was clicked on.
action = DataGridSelectionAction.SelectCurrent;
}
else if (ctrl && this.EditingRow == null)
{
action = DataGridSelectionAction.RemoveCurrentFromSelection;
}
else
{
action = DataGridSelectionAction.None;
}
}
else
{
// Selecting a single row or multi-selecting with Ctrl.
if (this.SelectionMode == DataGridSelectionMode.Single || !ctrl)
{
// Unselect the currently selected rows except the newly selected row.
action = DataGridSelectionAction.SelectCurrent;
}
else
{
action = DataGridSelectionAction.AddCurrentToSelection;
}
}
UpdateSelectionAndCurrency(columnIndex, slot, action, false /*scrollIntoView*/);
}
finally
{
this.NoSelectionChangeCount--;
}
if (_successfullyUpdatedSelection && beginEdit && BeginCellEdit(args))
{
FocusEditingCell(true /*setFocus*/);
}
return true;
}
/// <summary>
/// Updates the DataGrid's validation results, modifies the ValidationSummary's items,
/// and sets the IsValid states of the UIElements.
/// </summary>
/// <param name="newValidationResults">New validation results.</param>
/// <param name="scrollIntoView">If the validation results have changed, scrolls the editing row into view.</param>
private void UpdateValidationResults(List<ValidationResult> newValidationResults, bool scrollIntoView)
{
bool validationResultsChanged = false;
DiagnosticsDebug.Assert(this.EditingRow != null, "Expected non-null EditingRow.");
// Remove the validation results that have been fixed
List<ValidationResult> removedValidationResults = new List<ValidationResult>();
foreach (ValidationResult oldValidationResult in _validationResults)
{
if (oldValidationResult != null && !newValidationResults.ContainsEqualValidationResult(oldValidationResult))
{
removedValidationResults.Add(oldValidationResult);
validationResultsChanged = true;
}
}
foreach (ValidationResult removedValidationResult in removedValidationResults)
{
_validationResults.Remove(removedValidationResult);
#if FEATURE_VALIDATION_SUMMARY
if (_validationSummary != null)
{
ValidationSummaryItem removedValidationSummaryItem = this.FindValidationSummaryItem(removedValidationResult);
if (removedValidationSummaryItem != null)
{
_validationSummary.Errors.Remove(removedValidationSummaryItem);
}
}
#endif
}
// Add any validation results that were just introduced
foreach (ValidationResult newValidationResult in newValidationResults)
{
if (newValidationResult != null && !_validationResults.ContainsEqualValidationResult(newValidationResult))
{
_validationResults.Add(newValidationResult);
#if FEATURE_VALIDATION_SUMMARY
if (_validationSummary != null && ShouldDisplayValidationResult(newValidationResult))
{
ValidationSummaryItem newValidationSummaryItem = this.CreateValidationSummaryItem(newValidationResult);
if (newValidationSummaryItem != null)
{
_validationSummary.Errors.Add(newValidationSummaryItem);
}
}
#endif
validationResultsChanged = true;
}
}
if (validationResultsChanged)
{
this.UpdateValidationStatus();
}
if (!this.IsValid && scrollIntoView)
{
// Scroll the row with the error into view.
int editingRowSlot = this.EditingRow.Slot;
#if FEATURE_VALIDATION_SUMMARY
if (_validationSummary != null)
{
// If the number of errors has changed, then the ValidationSummary will be a different size,
// and we need to delay our call to ScrollSlotIntoView
this.InvalidateMeasure();
// TODO: Move to DispatcherQueue when FEATURE_VALIDATION_SUMMARY is enabled
this.Dispatcher.BeginInvoke(() =>
{
// It's possible that the DataContext or ItemsSource has changed by the time we reach this code,
// so we need to ensure that the editing row still exists before scrolling it into view
if (!this.IsSlotOutOfBounds(editingRowSlot) && editingRowSlot != -1)
{
this.ScrollSlotIntoView(editingRowSlot, false /*scrolledHorizontally*/);
}
});
}
else
#endif
{
this.ScrollSlotIntoView(editingRowSlot, false /*scrolledHorizontally*/);
}
}
}
/// <summary>
/// Updates the IsValid states of the DataGrid, the EditingRow and its cells. All cells related to
/// property-level errors are set to Invalid. If there is an object-level error selected in the
/// ValidationSummary, then its associated cells will also be flagged (if there are any).
/// </summary>
private void UpdateValidationStatus()
{
if (this.EditingRow != null)
{
foreach (DataGridCell cell in this.EditingRow.Cells)
{
bool isCellValid = true;
DiagnosticsDebug.Assert(cell.OwningColumn != null, "Expected cell has owning column.");
if (!cell.OwningColumn.IsReadOnly)
{
foreach (ValidationResult validationResult in _validationResults)
{
bool validationResultIsSelectedValidationSummaryItemContext = false;
#if FEATURE_VALIDATION_SUMMARY
validationResultIsSelectedValidationSummaryItemContext = _selectedValidationSummaryItem != null && _selectedValidationSummaryItem.Context == validationResult;
#endif
if (_propertyValidationResults.ContainsEqualValidationResult(validationResult) ||
validationResultIsSelectedValidationSummaryItemContext)
{
foreach (string bindingPath in validationResult.MemberNames)
{
if (cell.OwningColumn.BindingPaths.Contains(bindingPath))
{
isCellValid = false;
break;
}
}
}
}
}
if (cell.IsValid != isCellValid)
{
cell.IsValid = isCellValid;
cell.ApplyCellState(true /*animate*/);
}
}
bool isRowValid = _validationResults.Count == 0;
if (this.EditingRow.IsValid != isRowValid)
{
this.EditingRow.IsValid = isRowValid;
this.EditingRow.ApplyState(true /*animate*/);
}
this.IsValid = isRowValid;
}
else
{
this.IsValid = true;
}
}
private void UpdateVerticalScrollBar(bool needVertScrollBar, bool forceVertScrollBar, double totalVisibleHeight, double cellsHeight)
{
if (_vScrollBar != null)
{
if (needVertScrollBar || forceVertScrollBar)
{
// ..........viewportSize
// v---v
// |<|_____|###|>|
// ^ ^
// min max
// we want to make the relative size of the thumb reflect the relative size of the viewing area
// viewportSize / (max + viewportSize) = cellsHeight / totalVisibleHeight
// -> viewportSize = max * cellsHeight / (totalVisibleHeight - cellsHeight)
//                 = max * cellsHeight / max
//                 = cellsHeight
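// Worked numeric example (hypothetical values): totalVisibleHeight = 900
// and cellsHeight = 300 give Maximum = 900 - 300 = 600 and
// ViewportSize = cellsHeight = 300, so the thumb fills
// 300 / (600 + 300) = 300 / 900 = cellsHeight / totalVisibleHeight,
// i.e. one third of the track.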
// always zero
_vScrollBar.Minimum = 0;
if (needVertScrollBar && !double.IsInfinity(cellsHeight))
{
// maximum travel distance -- not the total height
_vScrollBar.Maximum = totalVisibleHeight - cellsHeight;
DiagnosticsDebug.Assert(_vScrollBar.Maximum >= 0, "Expected positive _vScrollBar.Maximum.");
// total height of the display area
_vScrollBar.ViewportSize = cellsHeight;
_vScrollBar.LargeChange = cellsHeight;
_vScrollBar.IsEnabled = true;
}
else
{
_vScrollBar.Maximum = 0;
_vScrollBar.ViewportSize = 0;
_vScrollBar.IsEnabled = false;
}
if (_vScrollBar.Visibility != Visibility.Visible)
{
// This will trigger a call to this method via Cells_SizeChanged for which no processing is needed.
_vScrollBar.Visibility = Visibility.Visible;
_ignoreNextScrollBarsLayout = true;
if (!this.IsVerticalScrollBarOverCells && _vScrollBar.DesiredSize.Width == 0)
{
// We need to know the width for the rest of layout to work correctly so measure it now.
_vScrollBar.Measure(new Size(double.PositiveInfinity, double.PositiveInfinity));
}
}
}
else
{
_vScrollBar.Maximum = 0;
if (_vScrollBar.Visibility != Visibility.Collapsed)
{
// This will trigger a call to this method via Cells_SizeChanged for which no processing is needed.
_vScrollBar.Visibility = Visibility.Collapsed;
_ignoreNextScrollBarsLayout = true;
}
}
DataGridAutomationPeer peer = DataGridAutomationPeer.FromElement(this) as DataGridAutomationPeer;
if (peer != null)
{
peer.RaiseAutomationScrollEvents();
}
}
}
/// <summary>
/// Validates the current editing row and updates the visual states.
/// </summary>
/// <param name="scrollIntoView">If true, will scroll the editing row into view when a new error is introduced.</param>
/// <param name="wireEvents">If true, subscribes to the asynchronous INDEI ErrorsChanged events.</param>
/// <returns>True if the editing row is valid, false otherwise.</returns>
private bool ValidateEditingRow(bool scrollIntoView, bool wireEvents)
{
List<ValidationResult> validationResults;
if (_initializingNewItem)
{
// We only want to run property validation if we're initializing a new item. Instead of
// clearing all the errors, we will only remove those associated with the current column.
validationResults = new List<ValidationResult>(_validationResults);
}
else
{
// We're going to run full entity-level validation, so throw away the
// old errors since they will be recreated if they're still active.
_propertyValidationResults.Clear();
_indeiValidationResults.Clear();
validationResults = new List<ValidationResult>();
}
if (this.EditingRow != null)
{
object dataItem = this.EditingRow.DataContext;
DiagnosticsDebug.Assert(dataItem != null, "Expected non-null dataItem.");
if (!_initializingNewItem)
{
// Validate using the Validator.
ValidationContext context = new ValidationContext(dataItem, null, null);
Validator.TryValidateObject(dataItem, context, validationResults, true);
#if FEATURE_IDATAERRORINFO
// IDEI entity validation.
this.ValidateIdei(dataItem as IDataErrorInfo, null, null, validationResults);
#endif
// INDEI entity validation.
this.ValidateIndei(dataItem as INotifyDataErrorInfo, null, null, null, validationResults, wireEvents);
}
// IDEI and INDEI property validation.
foreach (DataGridColumn column in this.ColumnsInternal.GetDisplayedColumns(c => c.IsVisible && !c.IsReadOnly))
{
if (!_initializingNewItem || column == this.CurrentColumn)
{
foreach (string bindingPath in column.BindingPaths)
{
string declaringPath = null;
object declaringItem = dataItem;
string bindingProperty = bindingPath;
// Check for nested paths.
int lastIndexOfSeparator = bindingPath.LastIndexOfAny(new char[] { TypeHelper.PropertyNameSeparator, TypeHelper.LeftIndexerToken });
if (lastIndexOfSeparator >= 0)
{
declaringPath = bindingPath.Substring(0, lastIndexOfSeparator);
declaringItem = TypeHelper.GetNestedPropertyValue(dataItem, declaringPath);
if (bindingProperty[lastIndexOfSeparator] == TypeHelper.LeftIndexerToken)
{
bindingProperty = TypeHelper.PrependDefaultMemberName(declaringItem, bindingPath.Substring(lastIndexOfSeparator));
}
else
{
bindingProperty = bindingPath.Substring(lastIndexOfSeparator + 1);
}
}
if (_initializingNewItem)
{
// We're only re-validating the current column, so remove its old errors
// because we're about to check if they're still relevant.
foreach (ValidationResult oldValidationResult in _validationResults)
{
if (oldValidationResult != null && oldValidationResult.ContainsMemberName(bindingPath))
{
validationResults.Remove(oldValidationResult);
_indeiValidationResults.Remove(oldValidationResult);
_propertyValidationResults.Remove(oldValidationResult);
}
}
}
#if FEATURE_IDATAERRORINFO
// IDEI property validation.
this.ValidateIdei(declaringItem as IDataErrorInfo, bindingProperty, bindingPath, validationResults);
#endif
// INDEI property validation.
this.ValidateIndei(declaringItem as INotifyDataErrorInfo, bindingProperty, bindingPath, declaringPath, validationResults, wireEvents);
}
}
}
// Add any existing exception errors (in case we're editing a cell).
// Note: these errors will only be displayed in the ValidationSummary if the
// editing data item implements IDEI or INDEI.
foreach (ValidationResult validationResult in _bindingValidationResults)
{
validationResults.AddIfNew(validationResult);
_propertyValidationResults.AddIfNew(validationResult);
}
// Merge the new validation results with the existing ones.
this.UpdateValidationResults(validationResults, scrollIntoView);
// Return false if there are validation errors.
if (!this.IsValid)
{
return false;
}
}
// Return true if there are no errors or there is no editing row.
this.ResetValidationStatus();
return true;
}
#if FEATURE_IDATAERRORINFO
/// <summary>
/// Checks an IDEI data object for errors for the specified property. New errors are added to the
/// list of validation results.
/// </summary>
/// <param name="idei">IDEI object to validate.</param>
/// <param name="bindingProperty">Name of the property to validate.</param>
/// <param name="bindingPath">Path of the binding.</param>
/// <param name="validationResults">List of results to add to.</param>
private void ValidateIdei(IDataErrorInfo idei, string bindingProperty, string bindingPath, List<ValidationResult> validationResults)
{
if (idei != null)
{
string errorString = null;
if (string.IsNullOrEmpty(bindingProperty))
{
DiagnosticsDebug.Assert(string.IsNullOrEmpty(bindingPath));
ValidationUtil.CatchNonCriticalExceptions(() => { errorString = idei.Error; });
if (!string.IsNullOrEmpty(errorString))
{
validationResults.AddIfNew(new ValidationResult(errorString));
}
}
else
{
ValidationUtil.CatchNonCriticalExceptions(() => { errorString = idei[bindingProperty]; });
if (!string.IsNullOrEmpty(errorString))
{
ValidationResult validationResult = new ValidationResult(errorString, new List<string>() { bindingPath });
validationResults.AddIfNew(validationResult);
_propertyValidationResults.Add(validationResult);
}
}
}
}
#endif
/// <summary>
/// Checks an INDEI data object for errors on the specified path. New errors are added to the
/// list of validation results.
/// </summary>
/// <param name="indei">INDEI object to validate.</param>
/// <param name="bindingProperty">Name of the property to validate.</param>
/// <param name="bindingPath">Path of the binding.</param>
/// <param name="declaringPath">Path of the INDEI object.</param>
/// <param name="validationResults">List of results to add to.</param>
/// <param name="wireEvents">True if the ErrorsChanged event should be subscribed to.</param>
private void ValidateIndei(INotifyDataErrorInfo indei, string bindingProperty, string bindingPath, string declaringPath, List<ValidationResult> validationResults, bool wireEvents)
{
if (indei != null)
{
if (indei.HasErrors)
{
IEnumerable errors = null;
ValidationUtil.CatchNonCriticalExceptions(() => { errors = indei.GetErrors(bindingProperty); });
if (errors != null)
{
foreach (object errorItem in errors)
{
if (errorItem != null)
{
string errorString = null;
ValidationUtil.CatchNonCriticalExceptions(() => { errorString = errorItem.ToString(); });
if (!string.IsNullOrEmpty(errorString))
{
ValidationResult validationResult;
if (!string.IsNullOrEmpty(bindingProperty))
{
validationResult = new ValidationResult(errorString, new List<string>() { bindingPath });
_propertyValidationResults.Add(validationResult);
}
else
{
DiagnosticsDebug.Assert(string.IsNullOrEmpty(bindingPath), "Expected bindingPath is null or empty.");
validationResult = new ValidationResult(errorString);
}
validationResults.AddIfNew(validationResult);
_indeiValidationResults.AddIfNew(validationResult);
}
}
}
}
}
if (wireEvents && !_validationItems.ContainsKey(indei))
{
_validationItems.Add(indei, declaringPath);
indei.ErrorsChanged += new EventHandler<DataErrorsChangedEventArgs>(ValidationItem_ErrorsChanged);
}
}
}
/// <summary>
/// Handles the asynchronous INDEI errors that occur while the DataGrid is in editing mode.
/// </summary>
/// <param name="sender">INDEI item whose errors changed.</param>
/// <param name="e">Error event arguments.</param>
private void ValidationItem_ErrorsChanged(object sender, DataErrorsChangedEventArgs e)
{
INotifyDataErrorInfo indei = sender as INotifyDataErrorInfo;
if (_validationItems.ContainsKey(indei))
{
DiagnosticsDebug.Assert(this.EditingRow != null, "Expected non-null EditingRow.");
// Determine the binding path.
string bindingPath = _validationItems[indei];
if (string.IsNullOrEmpty(bindingPath))
{
bindingPath = e.PropertyName;
}
else if (!string.IsNullOrEmpty(e.PropertyName) && e.PropertyName.IndexOf(TypeHelper.LeftIndexerToken) >= 0)
{
bindingPath += TypeHelper.RemoveDefaultMemberName(e.PropertyName);
}
else
{
bindingPath += TypeHelper.PropertyNameSeparator + e.PropertyName;
}
// Remove the old errors.
List<ValidationResult> validationResults = new List<ValidationResult>();
foreach (ValidationResult validationResult in _validationResults)
{
ValidationResult oldValidationResult = _indeiValidationResults.FindEqualValidationResult(validationResult);
if (oldValidationResult != null && oldValidationResult.ContainsMemberName(bindingPath))
{
_indeiValidationResults.Remove(oldValidationResult);
}
else
{
validationResults.Add(validationResult);
}
}
// Find any new errors and update the visuals.
this.ValidateIndei(indei, e.PropertyName, bindingPath, null, validationResults, false /*wireEvents*/);
this.UpdateValidationResults(validationResults, false /*scrollIntoView*/);
// If we're valid now then reset our status.
if (this.IsValid)
{
this.ResetValidationStatus();
}
}
else if (indei != null)
{
indei.ErrorsChanged -= new EventHandler<DataErrorsChangedEventArgs>(ValidationItem_ErrorsChanged);
}
}
private void VerticalScrollBar_Scroll(object sender, ScrollEventArgs e)
{
ProcessVerticalScroll(e.ScrollEventType);
}
}
}
```
The Sibiu Cycling Tour (Cycling Tour of Sibiu until 2015) is a 2.1 category professional bicycle road race held in Sibiu, Romania. Its first edition took place in July 2011, as part of the UCI Europe Tour. The race is organised with the support of the local council as well as the regional council of Sibiu. Held entirely around the city, the race normally runs over four days including a prologue on the cobbled streets of the city, and two climbing stages, one on the Transfăgărășan road to Bâlea Lake and a second to the mountain resort of Păltiniș.
Overall winners
Classifications
As of the 2018 edition, the jerseys worn by the leaders of the individual classifications are:
– Yellow Jersey – The Yellow Jersey is worn by the leader of the overall classification.
– White Jersey – The White Jersey is worn by the leader of the mountains classification (white jersey prior to 2018).
– Orange Jersey – Worn by the best rider under 23 years of age on the overall classification.
– Blue Jersey – Worn by the leader of the sprints classification.
– Red Jersey – The Red Jersey presented to the leading Romanian rider on the overall classification.
– Green Jersey – Presented to the leader of the points classification (previously a white jersey).
Additionally:
– Grey Jersey – Awarded to the team leading the team classification (not worn in race).
From 2018 the red and green jerseys were presented on the podium only and not worn in the race.
Editions
2011
The Cycling Tour of Sibiu 2011 took place from 6 to 10 July 2011, organised as a 2.2 race on the UCI Europe Tour, over a total distance of . The race included five days of competition including a team time trial in the center of Sibiu. A total of 20 teams took part, with total prize money of 26,000 euros. The race was originally won by Vladimir Koev, but he was later stripped of all results from 2010 and 2011 following a positive test at the 2010 Tour of Romania.
2012
The Cycling Tour of Sibiu 2012 took place from 4 to 8 July 2012, organised as a 2.2 race on the UCI Europe Tour. The race for the first time included an opening prologue time trial and covered a total of .
2013
The Cycling Tour of Sibiu 2013 took place from 11 to 14 July. For the third edition the race was upgraded to UCI category 2.1 allowing UCI Pro Continental Teams to take part. Three Pro Continental teams accepted invites, , and although Vini Fantini would later withdraw after positive doping tests at the 2013 Giro d'Italia. At , the race was the longest to date despite being reduced to four days, with two stages taking place on the final day.
2014
The 2014 Sibiu Tour took place between 17 and 20 July. At , the race was the longest to date, and once more featured the traditional cobbled prologue and stages to Bâlea Lake and Păltiniș. Returning to the race for the first time since 2012 was a team time trial on the final day. The 2014 race featured two Pro Continental teams, and along with 20 continental and national teams competing for a prize fund of €29,889.
2015
The 2015 Tour of Sibiu took place between 1 and 5 July. For the first time it was raced over 5 days, and moved forward in the calendar by nearly three weeks. It was expected that the teams of all the jersey winners and stage winners from 2014, , , and , would compete again in 2015. Adria Mobil later withdrew to be replaced by taking the number of pro-continental teams in the race to four. The race was won by Mauro Finetto who won the mountain stage to Păltiniș and was able to retain his jersey through to the finale.
2016
The 2016 Sibiu Cycling Tour took place between 6 and 10 July having moved forward one week due to the local elections. The race opened with the traditional prologue and for the first time featured a mountain time trial to Bâlea Lake. This edition featured four pro-continental teams including for the first time, a British team, .
The race was won by Nikolay Mihaylov after he was part of a breakaway on Stage 2. The race was notable for its first Romanian stage winner, Andrei Nechita, who won the opening prologue, and also its first Australian stage winner, Steele Von Hoff.
2017
The 2017 Sibiu Cycling Tour took place between 5 and 9 July, featuring a traditional parcours of opening prologue, two intermediate and two mountain stages. The peloton featured three professional Continental teams, 17 Continental teams and a Romanian national team, and for the first time, teams from North America. The race was won by Egan Bernal who became the first Colombian winner.
2018
The 2018 Sibiu Cycling Tour took place between 5 and 8 July, featuring a traditional parcours of opening prologue, two mountain stages and for the first time since 2014, a team time trial. The peloton featured three professional Continental teams, fourteen Continental teams and two national teams.
Notes
References
External links
UCI Europe Tour races
Cycle races in Romania
Recurring sporting events established in 2011
2011 establishments in Romania
Summer events in Romania
András Máté Gömöri (born August 29, 1992) is a Hungarian actor, bodybuilder and powerlifter.
Personal life
András Máté Gömöri was only 10 years old when he lost his mother, a mathematics teacher, and 16 when his father, an army officer and engineer, passed away. He has two brothers, Péter and Gergely, who are 16 and 12 years his elders respectively. He moved to Kisgyőr with his father, who at the time was living on disability benefits due to heart disease. This is where he spent his childhood, and where his father met Gömöri’s future stepmother, Judit. He moved to a dormitory in Miskolc when he began his studies in Bláthy Ottó Electrical Engineering Vocational Secondary School (Bláthy Ottó Villamosipari Technikum). He spent a lot of time with his elder brother who lived in Miskolc, and later with his other brother, who lived in Pest.
He married Lilla Polyák in the summer of 2018.
Acting career
He was still a child when he got his first role in 2004 in the National Theatre of Miskolc.
He began his studies in the Pesti Magyar Academy of Drama at the Magyar Theatre — the successor of the Nemzeti Stúdió (Academy of the National Theatre) — in 2011, but his attraction gradually shifted from plays to musical theatre. In 2012 he applied for an acting student spot in the Pesti Broadway Studio at the Budapest Operetta Theater, but was offered the main role of Rudolph in the musical Elisabeth instead. From then on, he has acted in musicals such as Romeo and Juliet, Fame, Gone with the Wind, Flowers for Algernon, The Hunchback of Notre Dame, Singin' in the Rain, and Jekyll and Hyde.
In October 2018 he also debuted in prose in the drama The Night of the Tribades in the Pinceszínház located in Budapest.
Between 2017 and 2020 he completed the drama instructor and actor program of the University of Theatre and Film Arts in Budapest. His thesis explored the portrayal of negative characters.
Since 2014 he has appeared in various television series and the Hungarian movie Budapest Noir. For six months in 2018 he hosted the morning show on the television channel FEM3, and since January 2020 he has been hosting a lifestyle show on the channel TV2. He has also been involved with several events and galas.
In 2019 he was nominated for Playboy’s (Marquard Media Hungary) Man of the Year award in the performance arts category.
Sports career
At school, in addition to playing volleyball and football, he did athletics and participated in triathlons. Later on, he turned to archery, and started competing in 2004. It was around 2010 that he was introduced to gym workouts, which started to play a significant role in his life after moving to Budapest. In 2012, he was already an avid bodybuilder, swimmer and runner. In 2014 he was presented with the Fittest Actor of the Year award at the third Fitbalance Award Gala, which he won again in 2019.
Bodybuilding career
In his first natural bodybuilding competition, which was held by the INBA (International Natural Bodybuilding Association) in Hungary in the summer of 2019, he won the novice as well as the overall category. His trainer was Csaba Kalas, who has won multiple natural bodybuilding championships. In the following year he took a break from competing to focus on his studies, but had already decided to enter the 2021 Natural Olympia in Las Vegas. In the meantime, he was offered a sponsorship deal by a nutritional and dietary supplement company.
On October 1, 2021 he won a silver medal in the INBA Elite Tour & Pro Show natural bodybuilding competition in Pécs. This allowed him to enter the world championship on 30 October in Bucharest, where he came in fifth. In November he became the champion and then the overall winner of the Natural Olympia Classic Physique Tall category, and earned his pro card. He also won second place in the Open Bodybuilding category in Las Vegas.
Powerlifting career
In 2020 he took up another sport: powerlifting. His first competition was the RAW Powerlifting Hungarian Championship (RAW Erőemelő Magyar Bajnokság) held in 2021 in Budapest. Although his training still focused on bodybuilding, he came in 11th. In 2021 he became the founding competitor of the newly established powerlifting department of the Diósgyőri VTK sports club. In the Athletes Men's Open RAW Powerlifting Hungarian Championship (Athletes Férfi Open RAW Erőemelő Magyar Bajnokság) held in April 2022, he finished tenth in the 105 kg group. He was trained by Miklós Fekete, a National Powerlifting Association team captain and competitor for Sirius Lifting SE, and by powerlifting trainer Tamás Neszveda.
Awards and honors
Natural Olympia, Las Vegas, 2021
Classic Physique Overall Champion – pro card winner
Classic Physique Tall Champion
Second place – Men's Bodybuilding - Open Tall
Superbody, Budapest, 2021
Second place – Superbody Athletic category
INBA / PNBA World Championships, Bucharest, 2021
Fifth place
INBA Elite Tour & Pro Show, Pécs, 2021
Second place – Natural Bodybuilding – Open 175+
INBA Hungary Natural Bodybuilding INBA Grand Prix, Miskolc, 2019
Bodybuilding Absolute Champion
Bodybuilding Men’s – Novice 180+ Champion
Natural Bodybuilding – Open 180+ Champion
Fitbalance Award Gála 2014, 2019
Fittest Actor of the Year award
Csillag-Award (Budapest Operetta Theater), 2013
Discovery of the Year
Theatrical roles
Filmography
Film
Television
References
External links
1992 births
Living people
Hungarian male stage actors
Hungarian male film actors
Hungarian male television actors
Hungarian bodybuilders |
```cpp
#pragma once
#include <Parsers/IParserBase.h>
#include <Parsers/MySQL/ASTDeclareIndex.h>
#include <Parsers/MySQL/ASTDeclareColumn.h>
#include <Parsers/MySQL/ASTDeclareTableOptions.h>
namespace DB
{
namespace ErrorCodes
{
extern const int NOT_IMPLEMENTED;
}
namespace MySQLParser
{
class ASTDropQuery : public IAST
{
public:
enum Kind
{
Table,
View,
Database,
Index,
/// TRIGGER,FUNCTION,EVENT and so on, No need for support
Other,
};
Kind kind;
struct QualifiedName
{
String schema;
String shortName;
};
using QualifiedNames = std::vector<QualifiedName>;
QualifiedNames names;
bool if_exists{false};
/// drop or truncate
bool is_truncate{false};
ASTPtr clone() const override;
String getID(char /*delim*/) const override { return "ASTDropQuery"; }
protected:
void formatImpl(const FormatSettings & /*settings*/, FormatState & /*state*/, FormatStateStacked /*frame*/) const override
{
throw Exception(ErrorCodes::NOT_IMPLEMENTED, "Method formatImpl is not supported by MySQLParser::ASTDropQuery.");
}
};
class ParserDropQuery : public IParserBase
{
protected:
const char * getName() const override { return "DROP query"; }
bool parseImpl(Pos & pos, ASTPtr & node, Expected & expected) override;
};
}
}
``` |
```go
package images // import "github.com/docker/docker/daemon/images"
import (
"encoding/json"
"io"
"github.com/docker/docker/api/types/backend"
"github.com/docker/docker/image"
"github.com/docker/docker/layer"
"github.com/docker/docker/pkg/ioutils"
"github.com/docker/docker/pkg/system"
"github.com/pkg/errors"
)
// CommitImage creates a new image from a commit config
func (i *ImageService) CommitImage(c backend.CommitConfig) (image.ID, error) {
layerStore, ok := i.layerStores[c.ContainerOS]
if !ok {
return "", system.ErrNotSupportedOperatingSystem
}
rwTar, err := exportContainerRw(layerStore, c.ContainerID, c.ContainerMountLabel)
if err != nil {
return "", err
}
defer func() {
if rwTar != nil {
rwTar.Close()
}
}()
var parent *image.Image
if c.ParentImageID == "" {
parent = new(image.Image)
parent.RootFS = image.NewRootFS()
} else {
parent, err = i.imageStore.Get(image.ID(c.ParentImageID))
if err != nil {
return "", err
}
}
l, err := layerStore.Register(rwTar, parent.RootFS.ChainID())
if err != nil {
return "", err
}
defer layer.ReleaseAndLog(layerStore, l)
cc := image.ChildConfig{
ContainerID: c.ContainerID,
Author: c.Author,
Comment: c.Comment,
ContainerConfig: c.ContainerConfig,
Config: c.Config,
DiffID: l.DiffID(),
}
config, err := json.Marshal(image.NewChildImage(parent, cc, c.ContainerOS))
if err != nil {
return "", err
}
id, err := i.imageStore.Create(config)
if err != nil {
return "", err
}
if c.ParentImageID != "" {
if err := i.imageStore.SetParent(id, image.ID(c.ParentImageID)); err != nil {
return "", err
}
}
return id, nil
}
func exportContainerRw(layerStore layer.Store, id, mountLabel string) (arch io.ReadCloser, err error) {
rwlayer, err := layerStore.GetRWLayer(id)
if err != nil {
return nil, err
}
defer func() {
if err != nil {
layerStore.ReleaseRWLayer(rwlayer)
}
}()
// TODO: this mount call is not necessary as we assume that TarStream() should
// mount the layer if needed. But the Diff() function for windows requests that
// the layer should be mounted when calling it. So we reserve this mount call
// until windows driver can implement Diff() interface correctly.
_, err = rwlayer.Mount(mountLabel)
if err != nil {
return nil, err
}
archive, err := rwlayer.TarStream()
if err != nil {
rwlayer.Unmount()
return nil, err
}
return ioutils.NewReadCloserWrapper(archive, func() error {
archive.Close()
err = rwlayer.Unmount()
layerStore.ReleaseRWLayer(rwlayer)
return err
}), nil
}
// CommitBuildStep is used by the builder to create an image for each step in
// the build.
//
// This method is different from CreateImageFromContainer:
// * it doesn't attempt to validate container state
// * it doesn't send a commit action to metrics
// * it doesn't log a container commit event
//
// This is a temporary shim. Should be removed when builder stops using commit.
func (i *ImageService) CommitBuildStep(c backend.CommitConfig) (image.ID, error) {
container := i.containers.Get(c.ContainerID)
if container == nil {
// TODO: use typed error
return "", errors.Errorf("container not found: %s", c.ContainerID)
}
c.ContainerMountLabel = container.MountLabel
c.ContainerOS = container.OS
c.ParentImageID = string(container.ImageID)
return i.CommitImage(c)
}
``` |
Susan Dunlap (born June 20, 1943) is an American writer of mystery novels and short stories. Her novels have mostly appeared in one of four series, each with its own sleuthing protagonist: Vejay Haskell, Jill Smith, Kiernan O'Shaughnessy, or Darcy Lott. Through 2020, more than two dozen of Dunlap's book-length mysteries have appeared in print. She has also edited crime fiction and has contributed to anthologies, including A Woman's Eye (1991), and to periodicals such as Ellery Queen's Mystery Magazine and Alfred Hitchcock's Mystery Magazine. Her short story "Checkout" won a Macavity Award and an Anthony Award in 1994.
Dunlap was a founding member of Sisters in Crime and served as its president in 1990–91. Before becoming a full-time writer in 1984, she was a social worker in Baltimore (1966–67), New York City (1967), and Contra Costa County, California (1968–84). She has also worked as a paralegal, private investigator, and yoga teacher.
Personal life
Born in Kew Gardens, Queens, New York, Dunlap graduated from Bucknell University with a B.A. in 1965 and from the University of North Carolina with a Master of Arts in Teaching in 1966. She married Newell Dunlap in 1970. As of 2020, the Dunlaps lived near San Francisco.
Critical reception
Carol M. Harper in St. James Guide to Crime and Mystery Writers said in 1996 that "Dunlap has coupled authenticity in setting with a bizarre sense of humor appropriate for Northern California. Her series feature radically different heroines (amateur detective, police officer and licensed private detective) from three different backgrounds (rural northern California, urban northern California, and East Coast transplant to urban Southern California) to create three eminently readable series." Harper also praised Dunlap for her abilities as a writer of short stories and an editor of crime-story anthologies.
Kirkus Reviews praised Time Expired, featuring Berkeley, California, homicide detective Jill Smith, as "an adroitly plotted, consistently interesting police procedural."
Bibliography
Mystery series
Vejay Haskell
An Equal Opportunity Death (1984)
The Bohemian Connection (1985)
The Last Annual Slugfest (1986)
Jill Smith
Karma (1981)
As a Favor (1984)
Not Exactly a Brahmin (1985)
Too Close to the Edge (1987)
A Dinner to Die For (1987)
Diamond in the Buff (1990)
Death and Taxes (1992)
Time Expired (1993)
Sudden Exposure (1996)
Cop Out (1997)
Kiernan O'Shaughnessy
Pious Deception (1989)
Rogue Wave (1991)
High Fall (1994)
No Immunity (1998)
Darcy Lott
A Single Eye (2006)
Hungry Ghosts (2008)
Civil Twilight (2009)
Power Slide (2010)
No Footprints (2012)
Switchback (2015)
Out of Nowhere (2016)
Short story collections
The Celestial Buffet and Other Morsels of Murder (2001)
Karma and Other Stories (2002)
No Safety and Other Short Stories (2014)
Other
Deadly Allies II: Private Eye Writers of America and Sisters in Crime Collaborative Anthology, editor, with Robert J. Randisi (1994)
Fast Friends (novel) (2004)
References
External links
1943 births
20th-century American women writers
21st-century American women writers
American crime fiction writers
Bucknell University alumni
University of North Carolina alumni
Writers from Queens, New York
Writers from San Francisco
Anthony Award winners
Macavity Award winners
Living people |
```c
/**
******************************************************************************
* @file system_stm32h7xx.c
* @author MCD Application Team
* @brief CMSIS Cortex-Mx Device Peripheral Access Layer System Source File.
*
* This file provides two functions and one global variable to be called from
* user application:
* - SystemInit(): This function is called at startup just after reset and
* before branch to main program. This call is made inside
* the "startup_stm32h7xx.s" file.
*
* - SystemCoreClock variable: Contains the core clock frequency; it can be used
* by the user application to set up the SysTick
* timer or configure other parameters.
*
* - SystemCoreClockUpdate(): Updates the variable SystemCoreClock and must
* be called whenever the core clock is changed
* during program execution.
*
*
******************************************************************************
* @attention
*
* All rights reserved.
*
* This software component is licensed by ST under BSD 3-Clause license,
* opensource.org/licenses/BSD-3-Clause
*
******************************************************************************
*/
/** @addtogroup CMSIS
* @{
*/
/** @addtogroup stm32h7xx_system
* @{
*/
/** @addtogroup STM32H7xx_System_Private_Includes
* @{
*/
#include "stm32h7xx.h"
#include <math.h>
#if !defined (HSE_VALUE)
#define HSE_VALUE ((uint32_t)25000000) /*!< Value of the External oscillator in Hz */
#endif /* HSE_VALUE */
#if !defined (CSI_VALUE)
#define CSI_VALUE ((uint32_t)4000000) /*!< Value of the Internal oscillator in Hz*/
#endif /* CSI_VALUE */
#if !defined (HSI_VALUE)
#define HSI_VALUE ((uint32_t)64000000) /*!< Value of the Internal oscillator in Hz*/
#endif /* HSI_VALUE */
/**
* @}
*/
/** @addtogroup STM32H7xx_System_Private_TypesDefinitions
* @{
*/
/**
* @}
*/
/** @addtogroup STM32H7xx_System_Private_Defines
* @{
*/
/************************* Miscellaneous Configuration ************************/
/*!< Uncomment the following line if you need to use initialized data in D2 domain SRAM (AHB SRAM) */
/* #define DATA_IN_D2_SRAM */
/*!< Uncomment the following line if you need to relocate your vector Table in
Internal SRAM. */
/* #define VECT_TAB_SRAM */
#define VECT_TAB_OFFSET 0x00000000UL /*!< Vector Table base offset field.
This value must be a multiple of 0x200. */
/******************************************************************************/
/**
* @}
*/
/** @addtogroup STM32H7xx_System_Private_Macros
* @{
*/
/**
* @}
*/
/** @addtogroup STM32H7xx_System_Private_Variables
* @{
*/
/* This variable is updated in three ways:
1) by calling CMSIS function SystemCoreClockUpdate()
2) by calling HAL API function HAL_RCC_GetHCLKFreq()
3) each time HAL_RCC_ClockConfig() is called to configure the system clock frequency
Note: If you use this function to configure the system clock; then there
is no need to call the 2 first functions listed above, since SystemCoreClock
variable is updated automatically.
*/
uint32_t SystemCoreClock = 64000000;
uint32_t SystemD2Clock = 64000000;
const uint8_t D1CorePrescTable[16] = {0, 0, 0, 0, 1, 2, 3, 4, 1, 2, 3, 4, 6, 7, 8, 9};
/**
* @}
*/
/** @addtogroup STM32H7xx_System_Private_FunctionPrototypes
* @{
*/
/**
* @}
*/
/** @addtogroup STM32H7xx_System_Private_Functions
* @{
*/
/**
* @brief Setup the microcontroller system
* Initialize the FPU setting and vector table location
* configuration.
* @param None
* @retval None
*/
void SystemInit (void)
{
#if defined (DATA_IN_D2_SRAM)
__IO uint32_t tmpreg;
#endif /* DATA_IN_D2_SRAM */
/* FPU settings ------------------------------------------------------------*/
#if (__FPU_PRESENT == 1) && (__FPU_USED == 1)
SCB->CPACR |= ((3UL << (10*2))|(3UL << (11*2))); /* set CP10 and CP11 Full Access */
#endif
/* Reset the RCC clock configuration to the default reset state ------------*/
/* Increasing the CPU frequency */
if(FLASH_LATENCY_DEFAULT > (READ_BIT((FLASH->ACR), FLASH_ACR_LATENCY)))
{
/* Program the new number of wait states to the LATENCY bits in the FLASH_ACR register */
MODIFY_REG(FLASH->ACR, FLASH_ACR_LATENCY, (uint32_t)(FLASH_LATENCY_DEFAULT));
}
/* Set HSION bit */
RCC->CR |= RCC_CR_HSION;
/* Reset CFGR register */
RCC->CFGR = 0x00000000;
/* Reset HSEON, HSECSSON, CSION, HSI48ON, CSIKERON, PLL1ON, PLL2ON and PLL3ON bits */
RCC->CR &= 0xEAF6ED7FU;
/* Decreasing the number of wait states because of lower CPU frequency */
if(FLASH_LATENCY_DEFAULT < (READ_BIT((FLASH->ACR), FLASH_ACR_LATENCY)))
{
/* Program the new number of wait states to the LATENCY bits in the FLASH_ACR register */
MODIFY_REG(FLASH->ACR, FLASH_ACR_LATENCY, (uint32_t)(FLASH_LATENCY_DEFAULT));
}
#if defined(D3_SRAM_BASE)
/* Reset D1CFGR register */
RCC->D1CFGR = 0x00000000;
/* Reset D2CFGR register */
RCC->D2CFGR = 0x00000000;
/* Reset D3CFGR register */
RCC->D3CFGR = 0x00000000;
#else
/* Reset CDCFGR1 register */
RCC->CDCFGR1 = 0x00000000;
/* Reset CDCFGR2 register */
RCC->CDCFGR2 = 0x00000000;
/* Reset SRDCFGR register */
RCC->SRDCFGR = 0x00000000;
#endif
/* Reset PLLCKSELR register */
RCC->PLLCKSELR = 0x02020200;
/* Reset PLLCFGR register */
RCC->PLLCFGR = 0x01FF0000;
/* Reset PLL1DIVR register */
RCC->PLL1DIVR = 0x01010280;
/* Reset PLL1FRACR register */
RCC->PLL1FRACR = 0x00000000;
/* Reset PLL2DIVR register */
RCC->PLL2DIVR = 0x01010280;
/* Reset PLL2FRACR register */
RCC->PLL2FRACR = 0x00000000;
/* Reset PLL3DIVR register */
RCC->PLL3DIVR = 0x01010280;
/* Reset PLL3FRACR register */
RCC->PLL3FRACR = 0x00000000;
/* Reset HSEBYP bit */
RCC->CR &= 0xFFFBFFFFU;
/* Disable all interrupts */
RCC->CIER = 0x00000000;
#if (STM32H7_DEV_ID == 0x450UL)
/* dual core CM7 or single core line */
if((DBGMCU->IDCODE & 0xFFFF0000U) < 0x20000000U)
{
/* if stm32h7 revY*/
/* Change the switch matrix read issuing capability to 1 for the AXI SRAM target (Target 7) */
*((__IO uint32_t*)0x51008108) = 0x000000001U;
}
#endif
#if defined (DATA_IN_D2_SRAM)
/* in case of initialized data in D2 SRAM (AHB SRAM) , enable the D2 SRAM clock (AHB SRAM clock) */
#if defined(RCC_AHB2ENR_D2SRAM3EN)
RCC->AHB2ENR |= (RCC_AHB2ENR_D2SRAM1EN | RCC_AHB2ENR_D2SRAM2EN | RCC_AHB2ENR_D2SRAM3EN);
#elif defined(RCC_AHB2ENR_D2SRAM2EN)
RCC->AHB2ENR |= (RCC_AHB2ENR_D2SRAM1EN | RCC_AHB2ENR_D2SRAM2EN);
#else
RCC->AHB2ENR |= (RCC_AHB2ENR_AHBSRAM1EN | RCC_AHB2ENR_AHBSRAM2EN);
#endif /* RCC_AHB2ENR_D2SRAM3EN */
tmpreg = RCC->AHB2ENR;
(void) tmpreg;
#endif /* DATA_IN_D2_SRAM */
#if defined(DUAL_CORE) && defined(CORE_CM4)
/* Configure the Vector Table location add offset address for cortex-M4 ------------------*/
#ifdef VECT_TAB_SRAM
SCB->VTOR = D2_AXISRAM_BASE | VECT_TAB_OFFSET; /* Vector Table Relocation in Internal SRAM */
#else
SCB->VTOR = FLASH_BANK2_BASE | VECT_TAB_OFFSET; /* Vector Table Relocation in Internal FLASH */
#endif /* VECT_TAB_SRAM */
#else
/*
* Disable the FMC bank1 (enabled after reset).
* This prevents CPU speculative accesses to this bank, which would block use of
* the FMC for 24 us; during that time, other FMC masters (such as the LTDC)
* cannot use it.
*/
FMC_Bank1_R->BTCR[0] = 0x000030D2;
/* Configure the Vector Table location add offset address for cortex-M7 ------------------*/
#ifdef VECT_TAB_SRAM
SCB->VTOR = D1_AXISRAM_BASE | VECT_TAB_OFFSET; /* Vector Table Relocation in Internal AXI-RAM */
#else
SCB->VTOR = FLASH_BANK1_BASE | VECT_TAB_OFFSET; /* Vector Table Relocation in Internal FLASH */
#endif
#endif /*DUAL_CORE && CORE_CM4*/
}
/**
* @brief Update SystemCoreClock variable according to Clock Register Values.
* The SystemCoreClock variable contains the core clock frequency; it can
* be used by the user application to set up the SysTick timer or configure
* other parameters.
*
* @note Each time the core clock changes, this function must be called
* to update SystemCoreClock variable value. Otherwise, any configuration
* based on this variable will be incorrect.
*
* @note - The system frequency computed by this function is not the real
* frequency in the chip. It is calculated based on the predefined
* constant and the selected clock source:
*
* - If SYSCLK source is CSI, SystemCoreClock will contain the CSI_VALUE(*)
* - If SYSCLK source is HSI, SystemCoreClock will contain the HSI_VALUE(**)
* - If SYSCLK source is HSE, SystemCoreClock will contain the HSE_VALUE(***)
* - If SYSCLK source is PLL, SystemCoreClock will contain the CSI_VALUE(*),
* HSI_VALUE(**) or HSE_VALUE(***) multiplied/divided by the PLL factors.
*
* (*) CSI_VALUE is a constant defined in stm32h7xx_hal.h file (default value
* 4 MHz) but the real value may vary depending on the variations
* in voltage and temperature.
* (**) HSI_VALUE is a constant defined in stm32h7xx_hal.h file (default value
* 64 MHz) but the real value may vary depending on the variations
* in voltage and temperature.
*
* (***)HSE_VALUE is a constant defined in stm32h7xx_hal.h file (default value
* 25 MHz), user has to ensure that HSE_VALUE is same as the real
* frequency of the crystal used. Otherwise, this function may
* have wrong result.
*
* - The result of this function may not be correct when using a fractional
* value for the HSE crystal.
* @param None
* @retval None
*/
void SystemCoreClockUpdate (void)
{
uint32_t pllp, pllsource, pllm, pllfracen, hsivalue, tmp;
uint32_t common_system_clock;
float_t fracn1, pllvco;
/* Get SYSCLK source -------------------------------------------------------*/
switch (RCC->CFGR & RCC_CFGR_SWS)
{
case RCC_CFGR_SWS_HSI: /* HSI used as system clock source */
common_system_clock = (uint32_t) (HSI_VALUE >> ((RCC->CR & RCC_CR_HSIDIV)>> 3));
break;
case RCC_CFGR_SWS_CSI: /* CSI used as system clock source */
common_system_clock = CSI_VALUE;
break;
case RCC_CFGR_SWS_HSE: /* HSE used as system clock source */
common_system_clock = HSE_VALUE;
break;
case RCC_CFGR_SWS_PLL1: /* PLL1 used as system clock source */
/* PLL_VCO = (HSE_VALUE or HSI_VALUE or CSI_VALUE/ PLLM) * PLLN
SYSCLK = PLL_VCO / PLLR
*/
pllsource = (RCC->PLLCKSELR & RCC_PLLCKSELR_PLLSRC);
pllm = ((RCC->PLLCKSELR & RCC_PLLCKSELR_DIVM1)>> 4) ;
pllfracen = ((RCC->PLLCFGR & RCC_PLLCFGR_PLL1FRACEN)>>RCC_PLLCFGR_PLL1FRACEN_Pos);
fracn1 = (float_t)(uint32_t)(pllfracen* ((RCC->PLL1FRACR & RCC_PLL1FRACR_FRACN1)>> 3));
if (pllm != 0U)
{
switch (pllsource)
{
case RCC_PLLCKSELR_PLLSRC_HSI: /* HSI used as PLL clock source */
hsivalue = (HSI_VALUE >> ((RCC->CR & RCC_CR_HSIDIV)>> 3)) ;
pllvco = ( (float_t)hsivalue / (float_t)pllm) * ((float_t)(uint32_t)(RCC->PLL1DIVR & RCC_PLL1DIVR_N1) + (fracn1/(float_t)0x2000) +(float_t)1 );
break;
case RCC_PLLCKSELR_PLLSRC_CSI: /* CSI used as PLL clock source */
pllvco = ((float_t)CSI_VALUE / (float_t)pllm) * ((float_t)(uint32_t)(RCC->PLL1DIVR & RCC_PLL1DIVR_N1) + (fracn1/(float_t)0x2000) +(float_t)1 );
break;
case RCC_PLLCKSELR_PLLSRC_HSE: /* HSE used as PLL clock source */
pllvco = ((float_t)HSE_VALUE / (float_t)pllm) * ((float_t)(uint32_t)(RCC->PLL1DIVR & RCC_PLL1DIVR_N1) + (fracn1/(float_t)0x2000) +(float_t)1 );
break;
default:
hsivalue = (HSI_VALUE >> ((RCC->CR & RCC_CR_HSIDIV)>> 3)) ;
pllvco = ((float_t)hsivalue / (float_t)pllm) * ((float_t)(uint32_t)(RCC->PLL1DIVR & RCC_PLL1DIVR_N1) + (fracn1/(float_t)0x2000) +(float_t)1 );
break;
}
pllp = (((RCC->PLL1DIVR & RCC_PLL1DIVR_P1) >>9) + 1U ) ;
common_system_clock = (uint32_t)(float_t)(pllvco/(float_t)pllp);
}
else
{
common_system_clock = 0U;
}
break;
default:
common_system_clock = (uint32_t) (HSI_VALUE >> ((RCC->CR & RCC_CR_HSIDIV)>> 3));
break;
}
/* Compute SystemClock frequency --------------------------------------------------*/
#if defined (RCC_D1CFGR_D1CPRE)
tmp = D1CorePrescTable[(RCC->D1CFGR & RCC_D1CFGR_D1CPRE)>> RCC_D1CFGR_D1CPRE_Pos];
/* common_system_clock frequency : CM7 CPU frequency */
common_system_clock >>= tmp;
/* SystemD2Clock frequency : CM4 CPU, AXI and AHBs Clock frequency */
SystemD2Clock = (common_system_clock >> ((D1CorePrescTable[(RCC->D1CFGR & RCC_D1CFGR_HPRE)>> RCC_D1CFGR_HPRE_Pos]) & 0x1FU));
#else
tmp = D1CorePrescTable[(RCC->CDCFGR1 & RCC_CDCFGR1_CDCPRE)>> RCC_CDCFGR1_CDCPRE_Pos];
/* common_system_clock frequency : CM7 CPU frequency */
common_system_clock >>= tmp;
/* SystemD2Clock frequency : AXI and AHBs Clock frequency */
SystemD2Clock = (common_system_clock >> ((D1CorePrescTable[(RCC->CDCFGR1 & RCC_CDCFGR1_HPRE)>> RCC_CDCFGR1_HPRE_Pos]) & 0x1FU));
#endif
#if defined(DUAL_CORE) && defined(CORE_CM4)
SystemCoreClock = SystemD2Clock;
#else
SystemCoreClock = common_system_clock;
#endif /* DUAL_CORE && CORE_CM4 */
}
/**
* @}
*/
/**
* @}
*/
/**
* @}
*/
/************************ (C) COPYRIGHT STMicroelectronics *****END OF FILE****/
``` |
Charles R. Holbrook III (born September 1938) is an American politician in the state of Kentucky. He served in the Kentucky House of Representatives as a Republican from 1972 to 1988.
References
1938 births
Living people
Republican Party members of the Kentucky House of Representatives
People from Ashland, Kentucky |
```c++
// filesys.cpp -- <experimental/filesystem> implementation
// (see filesystem.cpp for C++17 <filesystem> implementation)
#define _SILENCE_EXPERIMENTAL_FILESYSTEM_DEPRECATION_WARNING
#include <yvals.h>
#include "awint.h"
#include <direct.h>
#include <experimental/filesystem>
#include <io.h>
#include <string.h>
#include <Windows.h>
_FS_BEGIN
static file_type _Map_mode(int _Mode) { // map Windows file attributes to file_type
constexpr int _File_attribute_regular =
FILE_ATTRIBUTE_ARCHIVE | FILE_ATTRIBUTE_COMPRESSED | FILE_ATTRIBUTE_ENCRYPTED | FILE_ATTRIBUTE_HIDDEN
| FILE_ATTRIBUTE_NORMAL | FILE_ATTRIBUTE_NOT_CONTENT_INDEXED | FILE_ATTRIBUTE_OFFLINE | FILE_ATTRIBUTE_READONLY
| FILE_ATTRIBUTE_SPARSE_FILE | FILE_ATTRIBUTE_SYSTEM | FILE_ATTRIBUTE_TEMPORARY;
if ((_Mode & FILE_ATTRIBUTE_DIRECTORY) != 0) {
return file_type::directory;
} else if ((_Mode & _File_attribute_regular) != 0) {
return file_type::regular;
} else {
return file_type::unknown;
}
}
_FS_DLL void __CLRCALL_PURE_OR_CDECL _Close_dir(void* _Handle) { // close a directory
FindClose((HANDLE) _Handle);
}
// DIRECTORY FUNCTIONS
static wchar_t* _Strcpy(wchar_t (&_Dest)[_MAX_FILESYS_NAME], const wchar_t* _Src) { // copy an NTCTS
::wcscpy_s(_Dest, _MAX_FILESYS_NAME, _Src);
return _Dest;
}
static HANDLE _FilesysOpenFile(const wchar_t* _Fname, DWORD _Desired_access, DWORD _Flags) {
#if defined(_CRT_APP)
CREATEFILE2_EXTENDED_PARAMETERS _Create_file_parameters = {};
_Create_file_parameters.dwSize = sizeof(_Create_file_parameters);
_Create_file_parameters.dwFileFlags = _Flags;
return CreateFile2(_Fname, _Desired_access, FILE_SHARE_READ | FILE_SHARE_WRITE | FILE_SHARE_DELETE, OPEN_EXISTING,
&_Create_file_parameters);
#else // defined(_CRT_APP)
return CreateFileW(
_Fname, _Desired_access, FILE_SHARE_READ | FILE_SHARE_WRITE | FILE_SHARE_DELETE, 0, OPEN_EXISTING, _Flags, 0);
#endif // defined(_CRT_APP)
}
_FS_DLL wchar_t* __CLRCALL_PURE_OR_CDECL _Read_dir(
wchar_t (&_Dest)[_MAX_FILESYS_NAME], void* _Handle, file_type& _Ftype) { // read a directory entry
WIN32_FIND_DATAW _Dentry;
for (;;) {
if (FindNextFileW((HANDLE) _Handle, &_Dentry) == 0) { // fail
_Ftype = file_type::unknown;
return _Strcpy(_Dest, L"");
}
if (_Dentry.cFileName[0] != L'.'
|| (_Dentry.cFileName[1] != L'\0'
&& (_Dentry.cFileName[1] != L'.'
|| _Dentry.cFileName[2] != L'\0'))) { // not "." or "..", get file type and return name
_Ftype = _Map_mode(_Dentry.dwFileAttributes);
return _Strcpy(_Dest, &_Dentry.cFileName[0]);
}
}
}
static unsigned int _Filesys_code_page() { // determine appropriate code page
#if defined(_ONECORE)
return CP_ACP;
#else // defined(_ONECORE)
if (AreFileApisANSI()) {
return CP_ACP;
} else {
return CP_OEMCP;
}
#endif // defined(_ONECORE)
}
_FS_DLL int __CLRCALL_PURE_OR_CDECL _To_wide(const char* _Bsrc, wchar_t* _Wdest) {
// return nonzero on success
return MultiByteToWideChar(_Filesys_code_page(), 0, _Bsrc, -1, _Wdest, _MAX_FILESYS_NAME);
}
_FS_DLL int __CLRCALL_PURE_OR_CDECL _To_byte(const wchar_t* _Wsrc, char* _Bdest) {
// return nonzero on success
return WideCharToMultiByte(_Filesys_code_page(), 0, _Wsrc, -1, _Bdest, _MAX_FILESYS_NAME, nullptr, nullptr);
}
_FS_DLL void* __CLRCALL_PURE_OR_CDECL _Open_dir(
wchar_t (&_Dest)[_MAX_FILESYS_NAME], const wchar_t* _Dirname, int& _Errno, file_type& _Ftype) {
// open a directory for reading
WIN32_FIND_DATAW _Dentry;
wstring _Wildname(_Dirname);
if (!_Wildname.empty()) {
_Wildname.append(L"\\*");
}
void* _Handle =
FindFirstFileExW(_Wildname.c_str(), FindExInfoStandard, &_Dentry, FindExSearchNameMatch, nullptr, 0);
if (_Handle == INVALID_HANDLE_VALUE) { // report failure
_Errno = ERROR_BAD_PATHNAME;
*_Dest = L'\0';
return 0;
}
// success, get first directory entry
_Errno = 0;
if (_Dentry.cFileName[0] == L'.'
&& (_Dentry.cFileName[1] == L'\0'
|| _Dentry.cFileName[1] == L'.' && _Dentry.cFileName[2] == L'\0')) { // skip "." and ".."
_Read_dir(_Dest, _Handle, _Ftype);
if (_Dest[0] != L'\0') {
return _Handle;
}
// no entries, release handle
_Close_dir(_Handle);
return 0;
}
// get file type and return handle
_Strcpy(_Dest, &_Dentry.cFileName[0]);
_Ftype = _Map_mode(_Dentry.dwFileAttributes);
return _Handle;
}
_FS_DLL bool __CLRCALL_PURE_OR_CDECL _Current_get(wchar_t (&_Dest)[_MAX_FILESYS_NAME]) {
// get current working directory
_Strcpy(_Dest, L"");
#if defined(_CRT_APP)
return false; // no support
#else // defined(_CRT_APP)
return _wgetcwd(_Dest, _MAX_FILESYS_NAME) != 0;
#endif // defined(_CRT_APP)
}
_FS_DLL bool __CLRCALL_PURE_OR_CDECL _Current_set(const wchar_t* _Dirname) {
// set current working directory
#if defined(_CRT_APP)
(void) _Dirname;
return false; // no support
#else // defined(_CRT_APP)
return _wchdir(_Dirname) == 0;
#endif // defined(_CRT_APP)
}
_FS_DLL wchar_t* __CLRCALL_PURE_OR_CDECL _Symlink_get(wchar_t (&_Dest)[_MAX_FILESYS_NAME], const wchar_t*) {
// get symlink -- DUMMY
_Dest[0] = wchar_t(0);
return &_Dest[0];
}
_FS_DLL wchar_t* __CLRCALL_PURE_OR_CDECL _Temp_get(wchar_t (&_Dest)[_MAX_FILESYS_NAME]) {
// get temp directory
wchar_t _Dentry[MAX_PATH];
return _Strcpy(_Dest, GetTempPathW(MAX_PATH, &_Dentry[0]) == 0 ? L"." : &_Dentry[0]);
}
_FS_DLL int __CLRCALL_PURE_OR_CDECL _Make_dir(const wchar_t* _Fname, const wchar_t*) {
// make a new directory (ignore attributes)
int _Ans = CreateDirectoryW(_Fname, 0);
if (_Ans != 0) {
return 1;
} else if (GetLastError() == ERROR_ALREADY_EXISTS) {
return 0;
} else {
return -1;
}
}
_FS_DLL bool __CLRCALL_PURE_OR_CDECL _Remove_dir(const wchar_t* _Fname) { // remove a directory
return _wrmdir(_Fname) != -1;
}
// FILE STATUS FUNCTIONS
_FS_DLL file_type __CLRCALL_PURE_OR_CDECL _Stat(const wchar_t* _Fname, perms* _Pmode) { // get file status
WIN32_FILE_ATTRIBUTE_DATA _Data;
if (GetFileAttributesExW(_Fname, GetFileExInfoStandard, &_Data)) {
// get file type and return permissions
if (_Pmode != 0) {
constexpr perms _Write_perms = perms::owner_write | perms::group_write | perms::others_write;
constexpr perms _Readonly_perms = perms::all & ~_Write_perms;
*_Pmode = _Data.dwFileAttributes & FILE_ATTRIBUTE_READONLY ? _Readonly_perms : perms::all;
}
return _Map_mode(_Data.dwFileAttributes);
}
// invalid, get error code
int _Errno = GetLastError();
if (_Errno == ERROR_BAD_NETPATH || _Errno == ERROR_BAD_PATHNAME || _Errno == ERROR_FILE_NOT_FOUND
|| _Errno == ERROR_INVALID_DRIVE || _Errno == ERROR_INVALID_NAME || _Errno == ERROR_INVALID_PARAMETER
|| _Errno == ERROR_PATH_NOT_FOUND) {
return file_type::not_found;
} else {
return file_type::unknown;
}
}
_FS_DLL file_type __CLRCALL_PURE_OR_CDECL _Lstat(const wchar_t* _Fname, perms* _Pmode) {
// get symlink file status
return _Stat(_Fname, _Pmode); // symlink not supported
}
_FS_DLL unsigned long long __CLRCALL_PURE_OR_CDECL _Hard_links(const wchar_t* _Fname) {
// get hard link count
HANDLE _Handle = _FilesysOpenFile(_Fname, FILE_READ_ATTRIBUTES, FILE_FLAG_BACKUP_SEMANTICS);
#if defined(_CRT_APP)
FILE_STANDARD_INFO _Info = {0};
bool _Ok = false;
if (_Handle != INVALID_HANDLE_VALUE) { // get file info
_Ok = GetFileInformationByHandleEx(_Handle, FileStandardInfo, &_Info, sizeof(_Info)) != 0;
CloseHandle(_Handle);
}
return _Ok ? _Info.NumberOfLinks : static_cast<unsigned long long>(-1);
#else // defined(_CRT_APP)
BY_HANDLE_FILE_INFORMATION _Info = {0};
bool _Ok = false;
if (_Handle != INVALID_HANDLE_VALUE) { // get file info
_Ok = GetFileInformationByHandle(_Handle, &_Info) != 0;
CloseHandle(_Handle);
}
return _Ok ? _Info.nNumberOfLinks : static_cast<unsigned long long>(-1);
#endif // defined(_CRT_APP)
}
_FS_DLL unsigned long long __CLRCALL_PURE_OR_CDECL _File_size(const wchar_t* _Fname) { // get file size
WIN32_FILE_ATTRIBUTE_DATA _Data;
if (!GetFileAttributesExW(_Fname, GetFileExInfoStandard, &_Data)) {
return static_cast<unsigned long long>(-1);
} else {
return static_cast<unsigned long long>(_Data.nFileSizeHigh) << 32 | _Data.nFileSizeLow;
}
}
// 3 centuries with 24 leap years each:
// 1600 is excluded, 1700/1800 are not leap years
// 1 partial century with 17 leap years:
// 1900 is not a leap year
// 1904 is leap year #1
// 1908 is leap year #2
// 1968 is leap year #17
constexpr uint64_t _Win_ticks_per_second = 10000000ULL;
constexpr uint64_t _Win_ticks_from_epoch = ((1970 - 1601) * 365 + 3 * 24 + 17) * 86400ULL * _Win_ticks_per_second;
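// Added sanity check (not in the original source): the leap-year accounting in
// the comment above gives 369 * 365 + 3 * 24 + 17 = 134774 days, which is the
// well-known 11644473600-second offset between the Windows FILETIME epoch
// (1601-01-01) and the Unix epoch (1970-01-01).
static_assert(_Win_ticks_from_epoch == 11644473600ULL * _Win_ticks_per_second,
    "FILETIME-to-Unix epoch offset mismatch");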
_FS_DLL int64_t __CLRCALL_PURE_OR_CDECL _Last_write_time(const wchar_t* _Fname) { // get last write time
WIN32_FILE_ATTRIBUTE_DATA _Data;
if (!GetFileAttributesExW(_Fname, GetFileExInfoStandard, &_Data)) {
return -1;
}
// success, convert time
unsigned long long _Wtime = static_cast<unsigned long long>(_Data.ftLastWriteTime.dwHighDateTime) << 32
| _Data.ftLastWriteTime.dwLowDateTime;
return static_cast<int64_t>(_Wtime - _Win_ticks_from_epoch);
}
_FS_DLL int __CLRCALL_PURE_OR_CDECL _Set_last_write_time(const wchar_t* _Fname, int64_t _When) {
// set last write time
HANDLE _Handle = _FilesysOpenFile(_Fname, FILE_WRITE_ATTRIBUTES, FILE_FLAG_BACKUP_SEMANTICS);
if (_Handle == INVALID_HANDLE_VALUE) {
return 0;
}
// convert to FILETIME and set
unsigned long long _Wtime = static_cast<unsigned long long>(_When) + _Win_ticks_from_epoch;
FILETIME _Ft;
_Ft.dwLowDateTime = static_cast<DWORD>(_Wtime); // intentionally discard upper bits
_Ft.dwHighDateTime = static_cast<DWORD>(_Wtime >> 32);
int _Result = SetFileTime(_Handle, nullptr, nullptr, &_Ft);
CloseHandle(_Handle);
return _Result;
}
_FS_DLL space_info __CLRCALL_PURE_OR_CDECL _Statvfs(const wchar_t* _Fname) {
// get space information for volume
space_info _Ans = {static_cast<uintmax_t>(-1), static_cast<uintmax_t>(-1), static_cast<uintmax_t>(-1)};
wstring _Devname = _Fname;
if (_Devname.empty() || (_Devname.back() != L'/' && _Devname.back() != L'\\')) {
_Devname.push_back(L'/');
}
_ULARGE_INTEGER _Available, _Capacity, _Free;
if (GetDiskFreeSpaceExW(_Devname.c_str(), &_Available, &_Capacity, &_Free)) { // convert values
_Ans.capacity = _Capacity.QuadPart;
_Ans.free = _Free.QuadPart;
_Ans.available = _Available.QuadPart;
}
return _Ans;
}
_FS_DLL int __CLRCALL_PURE_OR_CDECL _Equivalent(
const wchar_t* _Fname1, const wchar_t* _Fname2) { // test for equivalent file names
#if defined(_CRT_APP)
_FILE_ID_INFO _Info1 = {0};
_FILE_ID_INFO _Info2 = {0};
bool _Ok1 = false;
bool _Ok2 = false;
HANDLE _Handle = _FilesysOpenFile(_Fname1, FILE_READ_ATTRIBUTES, FILE_FLAG_BACKUP_SEMANTICS);
if (_Handle != INVALID_HANDLE_VALUE) { // get file1 info
_Ok1 = GetFileInformationByHandleEx(_Handle, FileIdInfo, &_Info1, sizeof(_Info1)) != 0;
CloseHandle(_Handle);
}
_Handle = _FilesysOpenFile(_Fname2, FILE_READ_ATTRIBUTES, FILE_FLAG_BACKUP_SEMANTICS);
if (_Handle != INVALID_HANDLE_VALUE) { // get file2 info
_Ok2 = GetFileInformationByHandleEx(_Handle, FileIdInfo, &_Info2, sizeof(_Info2)) != 0;
CloseHandle(_Handle);
}
if (!_Ok1 && !_Ok2) {
return -1;
} else if (!_Ok1 || !_Ok2) {
return 0;
} else { // test existing files for equivalence
return _Info1.VolumeSerialNumber != _Info2.VolumeSerialNumber
|| memcmp(&_Info1.FileId, &_Info2.FileId, sizeof(_Info1.FileId)) != 0
? 0
: 1;
}
#else // defined(_CRT_APP)
BY_HANDLE_FILE_INFORMATION _Info1 = {0};
BY_HANDLE_FILE_INFORMATION _Info2 = {0};
bool _Ok1 = false;
bool _Ok2 = false;
HANDLE _Handle = _FilesysOpenFile(_Fname1, FILE_READ_ATTRIBUTES, FILE_FLAG_BACKUP_SEMANTICS);
if (_Handle != INVALID_HANDLE_VALUE) { // get file1 info
_Ok1 = GetFileInformationByHandle(_Handle, &_Info1) != 0;
CloseHandle(_Handle);
}
_Handle = _FilesysOpenFile(_Fname2, FILE_READ_ATTRIBUTES, FILE_FLAG_BACKUP_SEMANTICS);
if (_Handle != INVALID_HANDLE_VALUE) { // get file2 info
_Ok2 = GetFileInformationByHandle(_Handle, &_Info2) != 0;
CloseHandle(_Handle);
}
if (!_Ok1 && !_Ok2) {
return -1;
} else if (!_Ok1 || !_Ok2) {
return 0;
} else { // test existing files for equivalence
return _Info1.dwVolumeSerialNumber != _Info2.dwVolumeSerialNumber
|| _Info1.nFileIndexHigh != _Info2.nFileIndexHigh || _Info1.nFileIndexLow != _Info2.nFileIndexLow
? 0
: 1;
}
#endif // defined(_CRT_APP)
}
// FILE LINKAGE FUNCTIONS
_FS_DLL int __CLRCALL_PURE_OR_CDECL _Link(const wchar_t* _Fname1, const wchar_t* _Fname2) {
// link _Fname2 to _Fname1
#if defined(_CRT_APP)
(void) _Fname1;
(void) _Fname2;
return errno = EDOM; // hardlinks not supported
#else // defined(_CRT_APP)
return CreateHardLinkW(_Fname2, _Fname1, 0) != 0 ? 0 : GetLastError();
#endif // defined(_CRT_APP)
}
_FS_DLL int __CLRCALL_PURE_OR_CDECL _Symlink(const wchar_t* _Fname1, const wchar_t* _Fname2) {
// link _Fname2 to _Fname1
#if defined(_CRT_APP)
(void) _Fname1;
(void) _Fname2;
return errno = EDOM; // symlinks not supported
#else // defined(_CRT_APP)
return __crtCreateSymbolicLinkW(_Fname2, _Fname1, 0) != 0 ? 0 : GetLastError();
#endif // defined(_CRT_APP)
}
_FS_DLL int __CLRCALL_PURE_OR_CDECL _Rename(const wchar_t* _Fname1, const wchar_t* _Fname2) {
// rename _Fname1 as _Fname2
return _wrename(_Fname1, _Fname2) == 0 ? 0 : GetLastError();
}
_FS_DLL int __CLRCALL_PURE_OR_CDECL _Resize(const wchar_t* _Fname, uintmax_t _Newsize) { // change file size
bool _Ok = false;
HANDLE _Handle = _FilesysOpenFile(_Fname, FILE_GENERIC_WRITE, 0);
if (_Handle != INVALID_HANDLE_VALUE) { // set file pointer to new size and trim
LARGE_INTEGER _Large;
_Large.QuadPart = _Newsize;
_Ok = SetFilePointerEx(_Handle, _Large, 0, FILE_BEGIN) != 0 && SetEndOfFile(_Handle) != 0;
CloseHandle(_Handle);
}
return _Ok ? 0 : GetLastError();
}
_FS_DLL int __CLRCALL_PURE_OR_CDECL _Unlink(const wchar_t* _Fname) { // unlink _Fname
return _wremove(_Fname) == 0 ? 0 : GetLastError();
}
_FS_DLL int __CLRCALL_PURE_OR_CDECL _Copy_file(const wchar_t* _Fname1, const wchar_t* _Fname2) {
// copy _Fname1 to _Fname2
#if defined(_ONECORE)
COPYFILE2_EXTENDED_PARAMETERS _Params = {0};
_Params.dwSize = sizeof(COPYFILE2_EXTENDED_PARAMETERS);
_Params.dwCopyFlags = 0;
const HRESULT _Copy_result = CopyFile2(_Fname1, _Fname2, &_Params);
if (SUCCEEDED(_Copy_result)) {
return 0;
}
// take lower bits to undo HRESULT_FROM_WIN32
return _Copy_result & 0x0000FFFFU;
#else // defined(_ONECORE)
return CopyFileW(_Fname1, _Fname2, 0) != 0 ? 0 : GetLastError();
#endif // defined(_ONECORE)
}
_FS_DLL int __CLRCALL_PURE_OR_CDECL _Chmod(const wchar_t* _Fname, perms _Newmode) {
// change file mode to _Newmode
WIN32_FILE_ATTRIBUTE_DATA _Data;
if (!GetFileAttributesExW(_Fname, GetFileExInfoStandard, &_Data)) {
return -1;
}
// got mode, alter readonly bit
DWORD _Oldmode = _Data.dwFileAttributes;
DWORD _Mode = _Oldmode & ~FILE_ATTRIBUTE_READONLY;
constexpr perms _Write_perms = perms::owner_write | perms::group_write | perms::others_write;
if ((_Newmode & _Write_perms) == perms::none) {
_Mode |= FILE_ATTRIBUTE_READONLY;
}
return _Mode == _Oldmode ? 0 : SetFileAttributesW(_Fname, _Mode) != 0 ? 0 : -1;
}
_FS_END
``` |
```javascript
//
//
// path_to_url
//
// Unless required by applicable law or agreed to in writing, software
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
//
////////////////////////////////////////////////////////////////////////////////
const { FuzzedDataProvider } = require('@jazzer.js/core');
const Jimp = require('jimp');
const { writeFileSync } = require('fs');
module.exports.fuzz = async function(data) {
try {
const provider = new FuzzedDataProvider(data);
const content = provider.consumeBytes(provider.consumeIntegralInRange(1, 4096));
let jimpInput;
if (provider.consumeBoolean()) {
jimpInput = Buffer.from(content);
} else {
jimpInput = "/tmp/fuzz.me";
writeFileSync(jimpInput, Buffer.from(content));
}
Jimp.read(jimpInput, (err, image) => {
if (err) return;
const width = provider.consumeIntegralInRange(0, image.bitmap.width);
const height = provider.consumeIntegralInRange(0, image.bitmap.height);
const x = provider.consumeIntegralInRange(0, image.bitmap.width - width);
const y = provider.consumeIntegralInRange(0, image.bitmap.height - height);
const cropImage = image.crop(x, y, width, height);
const resizeWidth = provider.consumeIntegralInRange(0, image.bitmap.width);
const resizeHeight = provider.consumeIntegralInRange(0, image.bitmap.height);
const resizeImage = cropImage.resize(resizeWidth, resizeHeight);
const blurRadius = provider.consumeNumberInRange(0, 100);
const blurImage = resizeImage.blur(blurRadius);
const contrastValue = provider.consumeNumberInRange(-1, 1);
const contrastImage = blurImage.contrast(contrastValue);
const brightnessValue = provider.consumeNumberInRange(-1, 1);
const brightnessImage = contrastImage.brightness(brightnessValue);
const hueValue = provider.consumeNumberInRange(-1, 1);
const hueImage = brightnessImage.hue(hueValue);
const invertImage = hueImage.invert();
const greyscaleImage = invertImage.greyscale();
const sepiaImage = greyscaleImage.sepia();
const thresholdValue = provider.consumeNumberInRange(0, 1);
sepiaImage.threshold(thresholdValue);
const pixelColorX = provider.consumeIntegralInRange(0, image.bitmap.width);
const pixelColorY = provider.consumeIntegralInRange(0, image.bitmap.height);
image.getPixelColor(pixelColorX, pixelColorY);
const fontPath = pickRandom([
provider.consumeString(128),
Jimp.FONT_SANS_8_BLACK,
Jimp.FONT_SANS_10_BLACK,
Jimp.FONT_SANS_12_BLACK,
Jimp.FONT_SANS_14_BLACK,
Jimp.FONT_SANS_16_BLACK,
Jimp.FONT_SANS_32_BLACK,
Jimp.FONT_SANS_64_BLACK,
Jimp.FONT_SANS_128_BLACK,
Jimp.FONT_SANS_8_WHITE,
Jimp.FONT_SANS_16_WHITE,
Jimp.FONT_SANS_32_WHITE,
Jimp.FONT_SANS_64_WHITE,
Jimp.FONT_SANS_128_WHITE,
]);
const fontColor = provider.consumeNumber();
const fontSize = provider.consumeIntegralInRange(0, 100);
const fontX = provider.consumeIntegralInRange(0, image.bitmap.width);
const fontY = provider.consumeIntegralInRange(0, image.bitmap.height);
const text = provider.consumeString(10);
Jimp.loadFont(fontPath).then((font) => {
image.print(font, fontX, fontY, text, fontSize, fontColor);
});
const color = Jimp.color([
provider.consumeIntegralInRange(0, 256),
provider.consumeIntegralInRange(0, 256),
provider.consumeIntegralInRange(0, 256),
provider.consumeIntegralInRange(0, 256),
]);
image.color([
{ apply: 'hue', params: [provider.consumeNumberInRange(-1, 1)] },
{ apply: 'lighten', params: [provider.consumeNumberInRange(-1, 1)] },
{ apply: 'saturate', params: [provider.consumeNumberInRange(-1, 1)] },
{ apply: 'mix', params: [color, provider.consumeNumberInRange(0, 1)] },
]);
const filterType = pickRandom([
Jimp.AUTO,
Jimp.BLUR,
Jimp.SHARPEN,
Jimp.EDGE_DETECT,
Jimp.EMBOSS,
Jimp.GAUSSIAN,
]);
image.filter(filterType);
const kernel = [
[-1, -1, -1],
[-1, 9, -1],
[-1, -1, -1],
];
image.convolution(kernel);
const bufferType = pickRandom([
Jimp.MIME_PNG,
Jimp.MIME_JPEG,
Jimp.MIME_BMP,
Jimp.MIME_TIFF,
]);
image.getBuffer(bufferType);
const compositeImage = image.clone();
const compositeX = provider.consumeIntegralInRange(0, image.bitmap.width);
const compositeY = provider.consumeIntegralInRange(0, image.bitmap.height);
compositeImage.composite(image, compositeX, compositeY, {
mode: pickRandom([
Jimp.BLEND_SOURCE_OVER,
Jimp.BLEND_DESTINATION_OVER,
Jimp.BLEND_MULTIPLY,
Jimp.BLEND_ADD,
Jimp.BLEND_SCREEN,
Jimp.BLEND_OVERLAY,
Jimp.BLEND_DARKEN,
Jimp.BLEND_LIGHTEN,
Jimp.BLEND_HARDLIGHT,
Jimp.BLEND_DIFFERENCE,
Jimp.BLEND_EXCLUSION]),
opacitySource: provider.consumeNumberInRange(-1, 1),
opacityDest: provider.consumeNumberInRange(-1, 1),
});
const backgroundColor = Jimp.color([
provider.consumeIntegralInRange(0, 256),
provider.consumeIntegralInRange(0, 256),
provider.consumeIntegralInRange(0, 256),
provider.consumeIntegralInRange(0, 256),
]);
const backgroundX = provider.consumeIntegralInRange(0, image.bitmap.width);
const backgroundY = provider.consumeIntegralInRange(0, image.bitmap.height);
const backgroundWidth = provider.consumeIntegralInRange(0, image.bitmap.width - backgroundX);
const backgroundHeight = provider.consumeIntegralInRange(0, image.bitmap.height - backgroundY);
image.background(backgroundColor, backgroundX, backgroundY, backgroundWidth, backgroundHeight);
});
} catch (error) {
if (!ignoredError(error)) throw error;
}
};
function ignoredError(error) {
return Boolean(ignored.find((message) => error.message.indexOf(message) !== -1));
}
const ignored = [];
function pickRandom(array) {
return array[Math.floor(Math.random() * array.length)];
}
``` |
```python
#
#
#
# path_to_url
#
# Unless required by applicable law or agreed to in writing, software
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
#
import torch
import time
import argparse
from ipex_llm.transformers import AutoModelForCausalLM
from transformers import AutoTokenizer
# prompt format referred from path_to_url
# and path_to_url#L7-L49
# For English prompt, you are recommended to change the prompt format.
BAICHUAN_PROMPT_FORMAT = "<reserved_106> {prompt} <reserved_107>"
if __name__ == '__main__':
parser = argparse.ArgumentParser(description='Predict Tokens using `generate()` API for Baichuan model')
parser.add_argument('--repo-id-or-model-path', type=str, default="baichuan-inc/Baichuan2-7B-Chat",
help='The huggingface repo id for the Baichuan model to be downloaded'
', or the path to the huggingface checkpoint folder')
parser.add_argument('--prompt', type=str, default="AI",
help='Prompt to infer')
parser.add_argument('--n-predict', type=int, default=32,
help='Max tokens to predict')
args = parser.parse_args()
model_path = args.repo_id_or_model_path
# Load model in 4 bit,
# which converts the relevant layers in the model into INT4 format
# If your selected model can reuse previous key/value states to speed up
# decoding but has `"use_cache": false` in its model config, set
# `use_cache=True` explicitly in `from_pretrained` to obtain optimal
# performance with IPEX-LLM INT4 optimizations.
# When running LLMs on Intel iGPUs for Windows users, we recommend setting `cpu_embedding=True` in the from_pretrained function.
# This will allow the memory-intensive embedding layer to utilize the CPU instead of iGPU.
model = AutoModelForCausalLM.from_pretrained(model_path,
load_in_4bit=True,
trust_remote_code=True,
use_cache=True)
model = model.half().to('xpu')
# Load tokenizer
tokenizer = AutoTokenizer.from_pretrained(model_path,
trust_remote_code=True)
# Generate predicted tokens
with torch.inference_mode():
prompt = BAICHUAN_PROMPT_FORMAT.format(prompt=args.prompt)
input_ids = tokenizer.encode(prompt, return_tensors="pt").to('xpu')
# ipex_llm model needs a warmup, then inference time can be accurate
output = model.generate(input_ids,
max_new_tokens=args.n_predict)
# start inference
st = time.time()
output = model.generate(input_ids,
max_new_tokens=args.n_predict)
torch.xpu.synchronize()
end = time.time()
output = output.cpu()
output_str = tokenizer.decode(output[0], skip_special_tokens=True)
print(f'Inference time: {end-st} s')
print('-'*20, 'Prompt', '-'*20)
print(prompt)
print('-'*20, 'Output', '-'*20)
print(output_str)
``` |
```java
package com.yahoo.vespa.streamingvisitors.tracing;
import org.junit.jupiter.api.Test;
import static org.mockito.ArgumentMatchers.any;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.times;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;
public class SamplingTraceExporterTest {
@Test
void sampling_decision_is_deferred_to_provided_sampler() {
var exporter = mock(TraceExporter.class);
var sampler = mock(SamplingStrategy.class);
when(sampler.shouldSample()).thenReturn(true, false);
var samplingExporter = new SamplingTraceExporter(exporter, sampler);
samplingExporter.maybeExport(() -> new TraceDescription(null, ""));
verify(exporter, times(1)).maybeExport(any());
samplingExporter.maybeExport(() -> new TraceDescription(null, ""));
verify(exporter, times(1)).maybeExport(any()); // no further invocations since the sampler returned false
}
}
``` |
```html
<!DOCTYPE html>
<html lang="en">
<head>
<meta http-equiv="refresh" content="0;URL=struct.TermBuilder.html">
</head>
<body>
<p>Redirecting to <a href="struct.TermBuilder.html">struct.TermBuilder.html</a>...</p>
<script>location.replace("struct.TermBuilder.html" + location.search + location.hash);</script>
</body>
</html>
``` |
Bhaktha Jana is a 1948 Indian Tamil-language film directed and produced by P. Pullaiah. The film featured C. Honnappa Bhagavathar, V. Nagayya and Santha Kumari, with K. Sarangapani and B. R. Panthulu in supporting roles.
Plot
Janaka (Santha Kumari) has been a staunch devotee of Panduranga since childhood. Her mother does not approve, feeling that such blind devotion will harm her daughter's marriage prospects and her future. Frustrated by her mother's attitude, Janaka leaves home and is found by Panduranga (C. Honnappa Bhagavathar) in the guise of a hermit, who tells her to devote herself to Panduranga by worshipping him daily at his temple. Janaka does so, much to the discomfort and anger of another devotee, Panthoji (B. R. Panthulu), who resents her coming to the temple to worship Panduranga and orders his disciples to throw her out. She is saved by Namadeva (V. Nagayya), who gives her asylum in his home. One night the temple's jewellery goes missing and is found with Janaka, and Panthoji and his disciples accuse her of theft. When they open Panduranga's shrine, they are shocked to find the idol missing; the next moment, they see the idol, along with the jewellery, in her hands. Panthoji realises that she has Panduranga's blessings and apologises to her for his accusations.
Cast
Adapted from Film News Anandan and The Hindu
C. Honnappa Bhagavathar as Panduranga
V. Nagayya as Namadeva
Santha Kumari as Janaka
K. Sarangapani
B. R. Panthulu as Panthoji
T. V. Kumudhini
T. N. Meenakshi
K. R. Chellam
Dance
Lalitha as Kubja
Padmini as Krishna
Production
Bhaktha Jana continued the trend of devotional films made in South Indian cinema from the 1930s and 1940s. P. Pullaiah directed and produced the film under his own banner Ragini Films.
Soundtrack
Reema–Narayanan and B. Narasimha Rao were in charge of the music and score for Bhaktha Jana while the songs' lyrics were written by Papanasam Sivan and Rajagopal Iyer.
Reception
Film historian Randor Guy notes that Bhaktha Jana is "remembered for the emotional story and fine on-screen narration by P. Pullaiah, and impressive performances by Shanthakumari, Panthulu and [Nagayya]".
References
1948 films
1940s Tamil-language films
Indian drama films
Indian black-and-white films
Films about reincarnation
Hindu mythological films
Hindu devotional films
Films directed by P. Pullayya
1948 drama films |
```tsx
import {
Alert,
Button,
Header,
Modal,
SpaceBetween,
Spinner,
} from "@cloudscape-design/components";
import { ChangeEvent, useCallback, useRef, useState } from "react";
export interface FileUploadProps {
accept?: string[];
disabled: boolean | undefined;
onSubmit: (file: File) => Promise<void>;
}
function FileUpload({ disabled, onSubmit, accept = [] }: FileUploadProps) {
const [modalVisible, setModalVisible] = useState(false);
const [isLoading, setIsLoading] = useState(false);
const [selectedFile, setSelectedFile] = useState<File | null>(null);
const [error, setError] = useState<string | null>(null);
const fileInput = useRef<HTMLInputElement>(null);
const handleChange = useCallback(
({ target }: ChangeEvent<HTMLInputElement>) => {
if (target.files?.length) {
setSelectedFile(target.files[0]);
}
},
[setSelectedFile]
);
const handleDismiss = useCallback(() => {
setModalVisible(false);
setError(null);
setSelectedFile(null);
}, [setModalVisible, setError, setSelectedFile]);
const handleSelectFiles = useCallback(() => {
setError(null);
setSelectedFile(null);
fileInput.current?.click();
}, [setError, setSelectedFile, fileInput]);
const handleSubmit = useCallback(
async (e: React.FormEvent<HTMLFormElement>) => {
e.preventDefault();
setError(null);
const formData = new FormData(e.currentTarget);
const file = formData.get("file") as File | null;
if (!file) {
setError("No file selected.");
return;
}
try {
setIsLoading(true);
await onSubmit(file);
setModalVisible(false);
} catch (err) {
console.error("Upload failed.", err);
const message = (err as Error)?.message ?? "Upload failed.";
setError(message);
} finally {
setIsLoading(false);
}
},
[setError, setIsLoading, setModalVisible, onSubmit]
);
return (
<>
<Button disabled={disabled} onClick={() => setModalVisible(true)}>
Upload
</Button>
<Modal
visible={modalVisible}
header={<Header variant="h2">Upload file</Header>}
onDismiss={handleDismiss}
>
<SpaceBetween size="s">
{error && <Alert type="error">{error}</Alert>}
<form encType="multipart/form-data" onSubmit={handleSubmit}>
{selectedFile && (
<img
width={125}
src={URL.createObjectURL(selectedFile)}
alt="Image to upload"
/>
)}
<SpaceBetween size="s" direction="horizontal">
<Button
disabled={disabled}
formAction="none"
onClick={handleSelectFiles}
>
Select file
</Button>
<Button
disabled={isLoading || !selectedFile}
>
{isLoading ? <Spinner /> : "Upload"}
</Button>
</SpaceBetween>
<input
name="file"
ref={fileInput}
type="file"
accept={accept.join(",")}
onChange={handleChange}
hidden
/>
</form>
</SpaceBetween>
</Modal>
</>
);
}
export default FileUpload;
``` |
```java
/*
* DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
*
* This code is free software; you can redistribute it and/or modify it
* published by the Free Software Foundation. Oracle designates this
* particular file as subject to the "Classpath" exception as provided
* by Oracle in the LICENSE file that accompanied this code.
*
* This code is distributed in the hope that it will be useful, but WITHOUT
* ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
* version 2 for more details (a copy is included in the LICENSE file that
* accompanied this code).
*
* 2 along with this work; if not, write to the Free Software Foundation,
* Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA.
*
* Please contact Oracle, 500 Oracle Parkway, Redwood Shores, CA 94065 USA
* or visit www.oracle.com if you need additional information or have any
* questions.
*/
/*
*/
package jdk.graal.compiler.jtt.except;
import org.junit.Test;
import jdk.graal.compiler.jtt.JTTTest;
public class BC_athrow0 extends JTTTest {
static Throwable throwable = new Throwable();
public static int test(int arg) throws Throwable {
if (arg == 2) {
throw throwable;
}
return arg;
}
@Test
public void run0() throws Throwable {
runTest("test", 0);
}
@Test
public void run1() throws Throwable {
runTest("test", 2);
}
}
``` |
Matrona is a genus of damselflies in the family Calopterygidae.
Species include:
Matrona basilaris
Matrona corephaea
Matrona cyanoptera
Matrona kricheldorffi
Matrona nigripectus
Matrona oberthueri
Matrona oreades
Matrona taoi
References
External links
Matrona at the Encyclopedia of Life
Calopterygidae
Zygoptera genera
Taxa named by Edmond de Sélys Longchamps |
```java
/*
This file is part of the iText (R) project.
Authors: Apryse Software.
This program is offered under a commercial and under the AGPL license.
For commercial licensing, contact us at path_to_url For AGPL licensing, see below.
AGPL licensing:
This program is free software: you can redistribute it and/or modify
(at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
along with this program. If not, see <path_to_url
*/
package com.itextpdf.io.font.constants;
public final class FontWeights {
private FontWeights() {
}
// Font weight Thin
public static final int THIN = 100;
// Font weight Extra-light (Ultra-light)
public static final int EXTRA_LIGHT = 200;
// Font weight Light
public static final int LIGHT = 300;
// Font weight Normal
public static final int NORMAL = 400;
// Font weight Medium
public static final int MEDIUM = 500;
// Font weight Semi-bold
public static final int SEMI_BOLD = 600;
// Font weight Bold
public static final int BOLD = 700;
// Font weight Extra-bold (Ultra-bold)
public static final int EXTRA_BOLD = 800;
// Font weight Black (Heavy)
public static final int BLACK = 900;
public static int fromType1FontWeight(String weight) {
int fontWeight = NORMAL;
switch (weight.toLowerCase()) {
case "ultralight":
fontWeight = THIN;
break;
case "thin":
case "extralight":
fontWeight = EXTRA_LIGHT;
break;
case "light":
fontWeight = LIGHT;
break;
case "book":
case "regular":
case "normal":
fontWeight = NORMAL;
break;
case "medium":
fontWeight = MEDIUM;
break;
case "demibold":
case "semibold":
fontWeight = SEMI_BOLD;
break;
case "bold":
fontWeight = BOLD;
break;
case "extrabold":
case "ultrabold":
fontWeight = EXTRA_BOLD;
break;
case "heavy":
case "black":
case "ultra":
case "ultrablack":
fontWeight = BLACK;
break;
case "fat":
case "extrablack":
fontWeight = BLACK;
break;
}
return fontWeight;
}
public static int normalizeFontWeight(int fontWeight) {
// truncate to the nearest lower multiple of 100 (e.g. 462 -> 400), then clamp to [THIN, BLACK]
fontWeight = (fontWeight / 100) * 100;
if (fontWeight < FontWeights.THIN) return FontWeights.THIN;
if (fontWeight > FontWeights.BLACK) return FontWeights.BLACK;
return fontWeight;
}
}
``` |
Eighth Amendment may refer to:
Eighth Amendment to the United States Constitution, part of the United States Bill of Rights
Eighth Amendment of the Constitution of India, extended the period of reserved seats in the parliament
Eighth Amendment of the Constitution of Ireland, which recognized the equal right to life of an unborn child
Eighth Amendment to the Constitution of Pakistan, which changed Pakistan's government from a parliamentary system to a semi-presidential system
Eighth Amendment of the Constitution of South Africa, which allowed members of municipal councils to cross the floor from one political party to another without losing their seats |
```c++
#pragma once
/*
datetime.hpp
*/
/*
All rights reserved.
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions
are met:
1. Redistributions of source code must retain the above copyright
notice, this list of conditions and the following disclaimer.
2. Redistributions in binary form must reproduce the above copyright
notice, this list of conditions and the following disclaimer in the
documentation and/or other materials provided with the distribution.
3. The name of the authors may not be used to endorse or promote products
derived from this software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE AUTHOR ``AS IS'' AND ANY EXPRESS OR
IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES
OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED.
IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY DIRECT, INDIRECT,
INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT
NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF
THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
#include <WinCompat.h>
#include "FARString.hpp"
DWORD ConvertYearToFull(DWORD ShortYear);
int GetDateFormat();
wchar_t GetDateSeparator();
wchar_t GetTimeSeparator();
inline int GetDateFormatDefault() { return 1; }
inline wchar_t GetDateSeparatorDefault() { return L'-'; }
inline wchar_t GetTimeSeparatorDefault() { return L':'; }
inline const wchar_t* GetDateSeparatorDefaultStr() { return L"-"; }
inline const wchar_t* GetTimeSeparatorDefaultStr() { return L":"; }
int64_t FileTimeDifference(const FILETIME *a, const FILETIME *b);
uint64_t FileTimeToUI64(const FILETIME *ft);
void GetFileDateAndTime(const wchar_t *Src, LPWORD Dst, size_t Count, int Separator);
void StrToDateTime(const wchar_t *CDate, const wchar_t *CTime, FILETIME &ft, int DateFormat,
int DateSeparator, int TimeSeparator, bool bRelative = false);
void ConvertDate(const FILETIME &ft, FARString &strDateText, FARString &strTimeText, int TimeLength,
int Brief = FALSE, int TextMonth = FALSE, int FullYear = 0, int DynInit = FALSE);
void ConvertDate_ResetInit();
void ConvertRelativeDate(const FILETIME &ft, FARString &strDaysText, FARString &strTimeText);
void PrepareStrFTime();
size_t WINAPI StrFTime(FARString &strDest, const wchar_t *Format, const tm *t);
size_t MkStrFTime(FARString &strDest, const wchar_t *Fmt = nullptr);
``` |
Castercliff is an Iron Age multivallate hillfort situated close to the towns of Nelson and Colne in Lancashire, Northern England.
It is located on a hilltop overlooking the valley system of the River Calder and its tributaries, on the western edge of the South Pennines. On the upper part of the hill, triple rubble ramparts up to high, separated by ditches of similar depth, surround the site on all sides except the north. On this side the defences consist mainly of a single rampart and ditch, but some short lengths of triple rampart and ditch are also found here. The inner rampart may have been timber-laced and revetted with stone and enclosed an oval area measuring approximately .
The summit of the hill is above sea level and the surrounding ground falls rapidly on all sides except the south east. Here a neck of land, dropping from the summit, connects it to similarly high ground about away. Streams spring from either side of the ridge and the deep valleys which they have cut, especially on the south, offer additional defence.
Excavations during the 1970s appear to show that the site was never completed, and no evidence of occupation was unearthed. In the past, however, evidence of Roman occupation had been found, and in 1898 Harry Speight was in no doubt that the site was the Roman Colonio. The site is a Scheduled Ancient Monument.
The hillfort has been damaged by coal mining with old bell pits evident both inside and around the site.
Media gallery
See also
Scheduled monuments in Lancashire
References
External links
Aerial view and description
Hill forts in Lancashire
Scheduled monuments in Lancashire
Buildings and structures in the Borough of Pendle
Nelson, Lancashire |
```csharp
using System;
using sys = System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Atomix.CompilerExt;
using Atomix.CompilerExt.Attributes;
using Kernel_alpha.x86.Intrinsic;
namespace Atomix.mscorlib
{
public static class StringImpl
{
/*
Length Offset => [0x0C - 0x10)
Data Offset => [0x10 - )
*/
[Plug("System_Void__System_String__ctor_System_Char___")]
public static unsafe void ctor(byte* aFirstChar, char[] aChar)
{
ctor(aFirstChar, aChar, 0, aChar.Length);
}
[Plug("System_Void__System_String__ctor_System_Char__")]
public static unsafe void ctor(byte* aFirstChar, char* aChar)
{
int i = 0;
char* chars = (char*)(aFirstChar + 0x10);
while (*aChar != '\0')
{
*chars = *aChar;
++chars;
++aChar;
++i;
}
*((int*)(aFirstChar + 0xC)) = i;
}
[Plug(your_sha256_hashstem_Int32_")]
public static unsafe void ctor(byte* aFirstChar, char[] aChar, int Start, int Length)
{
int i;
char* chars = (char*)(aFirstChar + 0x10);
for (i = 0; i < Length; i++)
chars[i] = aChar[i + Start];
*((int*)(aFirstChar + 0xC)) = i;
}
[Plug("System_Char_System_String_get_Chars_System_Int32_")]
public static unsafe char Get_Chars(byte* aThis, int aIndex)
{
if (aIndex < 0 || aIndex >= Get_Length(aThis))
return '\0';
var xCharIdx = (char*)(aThis + 16);
return xCharIdx[aIndex];
}
[Plug("System_Int32_System_String_get_Length__")]
public unsafe static int Get_Length(byte* aThis)
{
// the length field is stored as a little-endian Int32 at offset 0xC
var xCharIdx = (byte*)(aThis + 12);
return (int)(xCharIdx[3] << 24 | xCharIdx[2] << 16 | xCharIdx[1] << 8 | xCharIdx[0]);
}
[Plug(your_sha256_hash_System_String__System_String_")]
public static string Concat(string s0, string s1, string s2, string s3)
{
return ConcatArray(new string[] { s0, s1, s2, s3 }, s0.Length + s1.Length + s2.Length + s3.Length);
}
[Plug(your_sha256_hash_System_String_")]
public static string Concat(string s0, string s1, string s2)
{
return ConcatArray(new string[] { s0, s1, s2 }, s0.Length + s1.Length + s2.Length);
}
[Plug(your_sha256_hash)]
public static string Concat(string s0, string s1)
{
return ConcatArray(new string[] { s0, s1 }, s0.Length + s1.Length);
}
[Plug("System_String_System_String_Concat_System_String___")]
public static string Concat(params string[] strs)
{
int len = 0;
for (int i = 0; i < strs.Length; i++)
len += strs[i].Length;
return ConcatArray(strs, len);
}
private static string ConcatArray(string[] strs, int length)
{
char[] xResult = new char[length];
int p = 0;
for (int i = 0; i < strs.Length; i++)
{
var str = strs[i];
for (int j = 0; j < str.Length; j++)
{
xResult[p++] = str[j];
}
}
return new String(xResult);
}
[Plug(your_sha256_hash_")]
public static string SubString(string aThis, int index, int length)
{
char[] xResult = new char[length];
for (int i = 0; i < length; i++)
{
xResult[i] = aThis[index + i];
}
return new String(xResult);
}
[Plug("System_String_System_String_Substring_System_Int32_")]
public static string SubString(string aThis, int index)
{
return SubString(aThis, index, aThis.Length - index);
}
[Plug("System_String_System_String_ToLower__")]
public static string ToLower(string aThis)
{
return ChangeCase(aThis, 65, 90, 32);
}
[Plug("System_String_System_String_ToUpper__")]
public static string ToUpper(string aThis)
{
return ChangeCase(aThis, 97, 122, -32);
}
[Plug("System_String_System_String_PadLeft_System_Int32__System_Char_")]
public static string PadLeft(string aThis, int TotalWidth, char paddingchar)
{
return Padding(aThis, TotalWidth, paddingchar, false);
}
[Plug("System_String_System_String_PadRight_System_Int32__System_Char_")]
public static string PadRight(string aThis, int TotalWidth, char paddingchar)
{
return Padding(aThis, TotalWidth, paddingchar, true);
}
[Label("getLength_System_Char__")]
public unsafe static int getLength(char* str)
{
int length = 0;
while (*str != '\0')
{
++str;
++length;
}
return length;
}
private static string Padding(string aThis, int TotalWidth, char PaddingChar, bool Direction)
{
var len = aThis.Length;
if (len >= TotalWidth)
return aThis;
char[] xResult = new char[TotalWidth];
if (Direction)
{
//Padding Right
for (int i = 0; i < TotalWidth; i++)
{
if (len <= i)
xResult[i] = PaddingChar;
else
xResult[i] = aThis[i];
}
}
else
{
var xOffset = TotalWidth - len;
//Padding Left
for (int i = 0; i < TotalWidth; i++)
{
if (i < xOffset)
xResult[i] = PaddingChar;
else
xResult[i] = aThis[i - xOffset];
}
}
return new String(xResult);
}
private static string ChangeCase(string xStr, int lowerASCII, int upperASCII, int value)
{
char[] xResult = new char[xStr.Length];
for (int i = 0; i < xStr.Length; i++)
{
var xChar = xStr[i];
if (xChar >= lowerASCII && xChar <= upperASCII)
{
xChar = (char)(xChar + value);
}
xResult[i] = xChar;
}
return new String(xResult);
}
[Plug("System_String___System_String_Split_System_Char___")]
public static string[] Split(string str, char[] c)
{
// Note: only the first separator character (c[0]) is honoured, and a segment longer than 255 characters would overflow xTemp.
int counter = 0;
for (int i = 0; i < str.Length; i++)
{
if (str[i] == c[0])
{
counter++;
}
}
string[] xResult = new string[counter + 1];
char[] xTemp = new char[255];
int mcounter = 0;
int zcounter = 0;
for (int i = 0; i < str.Length; i++)
{
if (str[i] == c[0])
{
char[] xTemp2 = new char[mcounter];
for (int j = 0; j < mcounter; j++)
{
xTemp2[j] = xTemp[j];
}
mcounter = 0;
xResult[zcounter] = new string(xTemp2);
zcounter++;
}
else
{
xTemp[mcounter] = str[i];
mcounter++;
if (i == str.Length - 1)
{
char[] xTemp2 = new char[mcounter];
for (int j = 0; j < mcounter; j++)
{
xTemp2[j] = xTemp[j];
}
mcounter = 0;
xResult[zcounter] = new string(xTemp2);
zcounter++;
}
}
}
return xResult;
}
[Plug("System_String_System_String_Trim__")]
public static string Trim(string aThis)
{
// Trim leading and trailing spaces.
int start = 0;
int end = aThis.Length;
while (start < end && aThis[start] == ' ')
start++;
while (end > start && aThis[end - 1] == ' ')
end--;
return aThis.Substring(start, end - start);
}
[Plug("System_String_System_String_Trim_System_Char___")]
public static string Trim(string aThis, char[] aChar)
{
// Trim any of the given characters from both ends.
int start = 0;
int end = aThis.Length;
while (start < end && ContainsChar(aChar, aThis[start]))
start++;
while (end > start && ContainsChar(aChar, aThis[end - 1]))
end--;
return aThis.Substring(start, end - start);
}
private static bool ContainsChar(char[] aChars, char c)
{
for (int i = 0; i < aChars.Length; i++)
if (aChars[i] == c)
return true;
return false;
}
[Plug("your_sha256_hashtring_")]
public static bool Equality(string str1, string str2)
{
var len = str1.Length;
if (len != str2.Length)
return false;
for (int i = 0; i < len; i++)
{
if (str1[i] != str2[i])
return false;
}
return true;
}
}
}
```
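The two-pass split strategy used above (first count separators to size the result array, then fill each segment) can be sketched in Python for clarity; `split_on` is a hypothetical helper name, not part of the plugged API:

```python
def split_on(s, sep):
    # First pass: count separators to size the result list.
    parts = [""] * (s.count(sep) + 1)
    idx = 0
    buf = []
    # Second pass: accumulate characters, flushing a segment at each separator.
    for ch in s:
        if ch == sep:
            parts[idx] = "".join(buf)
            buf = []
            idx += 1
        else:
            buf.append(ch)
    parts[idx] = "".join(buf)
    return parts

print(split_on("a,b,,c", ","))  # → ['a', 'b', '', 'c']
```

Pre-sizing the output avoids repeated reallocation, which matters in the plug's low-level context where dynamic collections may be unavailable.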
Thomas Conecte (died 1434) was a French Carmelite friar and preacher.
Born at Rennes, Conecte travelled through Cambrai, Tournai, Arras, Flanders, and Picardy, preaching sermons that vehemently denounced the vices of the clergy and the extravagant dress of the women, especially their lofty head-dresses, or hennins. He ventured to teach that he who is a true servant of God need fear no papal curse, that the Roman Catholic hierarchy is corrupt, and that marriage is permissible to the clergy, of whom only some have the gift of continence. Having inveighed against the disedifying life of certain priests, he had to seek safety in flight and left France for Italy. He was listened to by immense congregations, and in Italy, despite the opposition of Nicholas Kenton (died 1468), provincial of the English Carmelites, he introduced several changes into the rules of that order. He established a strict observance in the convent near Florence, which gradually developed into the Congregation of Mantua. He visited this latter convent in 1432 and then proceeded to Venice, and finally to Rome, where the manners of the Curia provoked anew his violent language and occasioned a charge of conspiracy against the pope. He was finally apprehended by order of Pope Eugene IV, condemned and burnt for heresy.
An account of Friar Thomas's preaching and its effect is given by Enguerrand de Monstrelet, provost of Cambrai (died 1453), in his continuation of Froissart's chronicles.
References
1434 deaths
15th-century Breton people
Carmelites
Executed people from Brittany
French Christian monks
People executed by the Papal States by burning
People executed for heresy
Clergy from Rennes
Year of birth unknown
```objective-c
#import <Foundation/Foundation.h>
@class GCDAsyncSocket;
@class WebSocket;
#if TARGET_OS_IPHONE
#if __IPHONE_OS_VERSION_MIN_REQUIRED >= 40000 // iPhone 4.0
#define IMPLEMENTED_PROTOCOLS <NSNetServiceDelegate>
#else
#define IMPLEMENTED_PROTOCOLS
#endif
#else
#if MAC_OS_X_VERSION_MIN_REQUIRED >= 1060 // Mac OS X 10.6
#define IMPLEMENTED_PROTOCOLS <NSNetServiceDelegate>
#else
#define IMPLEMENTED_PROTOCOLS
#endif
#endif
@interface HTTPServer : NSObject IMPLEMENTED_PROTOCOLS
{
// Underlying asynchronous TCP/IP socket
GCDAsyncSocket *asyncSocket;
// Dispatch queues
dispatch_queue_t serverQueue;
dispatch_queue_t connectionQueue;
void *IsOnServerQueueKey;
void *IsOnConnectionQueueKey;
// HTTP server configuration
NSString *documentRoot;
Class connectionClass;
NSString *interface;
UInt16 port;
// NSNetService and related variables
NSNetService *netService;
NSString *domain;
NSString *type;
NSString *name;
NSString *publishedName;
NSDictionary *txtRecordDictionary;
// Connection management
NSMutableArray *connections;
NSLock *connectionsLock;
BOOL isRunning;
}
/**
* Specifies the document root to serve files from.
* For example, if you set this to "/Users/<your_username>/Sites",
* then it will serve files out of the local Sites directory (including subdirectories).
*
* The default value is nil.
* The default server configuration will not serve any files until this is set.
*
* If you change the documentRoot while the server is running,
* the change will affect future incoming http connections.
**/
- (NSString *)documentRoot;
- (void)setDocumentRoot:(NSString *)value;
/**
* The connection class is the class used to handle incoming HTTP connections.
*
* The default value is [HTTPConnection class].
* You can override HTTPConnection, and then set this to [MyHTTPConnection class].
*
* If you change the connectionClass while the server is running,
* the change will affect future incoming http connections.
**/
- (Class)connectionClass;
- (void)setConnectionClass:(Class)value;
/**
* Set what interface you'd like the server to listen on.
* By default this is nil, which causes the server to listen on all available interfaces like en1, wifi etc.
*
* The interface may be specified by name (e.g. "en1" or "lo0") or by IP address (e.g. "192.168.4.34").
* You may also use the special strings "localhost" or "loopback" to specify that
* the socket only accept connections from the local machine.
**/
- (NSString *)interface;
- (void)setInterface:(NSString *)value;
/**
* The port number to run the HTTP server on.
*
* The default port number is zero, meaning the server will automatically use any available port.
* This is the recommended port value, as it avoids possible port conflicts with other applications.
* Technologies such as Bonjour can be used to allow other applications to automatically discover the port number.
*
 * Note: As is common on most OS's, you need root privileges to bind to port numbers below 1024.
*
* You can change the port property while the server is running, but it won't affect the running server.
* To actually change the port the server is listening for connections on you'll need to restart the server.
*
* The listeningPort method will always return the port number the running server is listening for connections on.
* If the server is not running this method returns 0.
**/
- (UInt16)port;
- (UInt16)listeningPort;
- (void)setPort:(UInt16)value;
/**
* Bonjour domain for publishing the service.
* The default value is "local.".
*
* Note: Bonjour publishing requires you set a type.
*
* If you change the domain property after the bonjour service has already been published (server already started),
* you'll need to invoke the republishBonjour method to update the broadcasted bonjour service.
**/
- (NSString *)domain;
- (void)setDomain:(NSString *)value;
/**
* Bonjour name for publishing the service.
* The default value is "".
*
* If using an empty string ("") for the service name when registering,
* the system will automatically use the "Computer Name".
* Using an empty string will also handle name conflicts
* by automatically appending a digit to the end of the name.
*
* Note: Bonjour publishing requires you set a type.
*
* If you change the name after the bonjour service has already been published (server already started),
* you'll need to invoke the republishBonjour method to update the broadcasted bonjour service.
*
* The publishedName method will always return the actual name that was published via the bonjour service.
* If the service is not running this method returns nil.
**/
- (NSString *)name;
- (NSString *)publishedName;
- (void)setName:(NSString *)value;
/**
* Bonjour type for publishing the service.
* The default value is nil.
* The service will not be published via bonjour unless the type is set.
*
* If you wish to publish the service as a traditional HTTP server, you should set the type to be "_http._tcp.".
*
* If you change the type after the bonjour service has already been published (server already started),
* you'll need to invoke the republishBonjour method to update the broadcasted bonjour service.
**/
- (NSString *)type;
- (void)setType:(NSString *)value;
/**
* Republishes the service via bonjour if the server is running.
* If the service was not previously published, this method will publish it (if the server is running).
**/
- (void)republishBonjour;
/**
*
**/
- (NSDictionary *)TXTRecordDictionary;
- (void)setTXTRecordDictionary:(NSDictionary *)dict;
/**
 * Attempts to start the server on the configured port, interface, etc.
*
* If an error occurs, this method returns NO and sets the errPtr (if given).
* Otherwise returns YES on success.
*
 * Some examples of errors that might occur:
 * - You specified that the server listen on a port which is already in use by another application.
 * - You specified that the server listen on a port number below 1024, which requires root privileges.
*
* Code Example:
*
* NSError *err = nil;
* if (![httpServer start:&err])
* {
* NSLog(@"Error starting http server: %@", err);
* }
**/
- (BOOL)start:(NSError **)errPtr;
/**
* Stops the server, preventing it from accepting any new connections.
* You may specify whether or not you want to close the existing client connections.
*
* The default stop method (with no arguments) will close any existing connections. (It invokes [self stop:NO])
**/
- (void)stop;
- (void)stop:(BOOL)keepExistingConnections;
- (BOOL)isRunning;
- (NSUInteger)numberOfHTTPConnections;
@end
```
This list of earthquakes in Bulgaria is organized by date and includes events that caused injuries or fatalities, historic quakes, as well as events that are notable for other reasons.
Earthquakes
Key
Epicenter outside Bulgaria
Gallery
See also
List of earthquakes in Romania
List of earthquakes in Italy
List of earthquakes in Croatia
External links
Seismic events in Bulgaria and surrounding regions, last 30 days - real time data from NIGGG-BAS
References
Bulgaria
Earthquakes
Annabelle Cleeland is an Australian politician who is the current member for the district of Euroa in the Victorian Legislative Assembly. She is a member of the Nationals and was elected in the 2022 state election, replacing retiring MLA Steph Ryan.
References
Year of birth missing (living people)
Living people
Members of the Victorian Legislative Assembly
21st-century Australian women politicians
National Party of Australia members of the Parliament of Victoria
21st-century Australian politicians
```java
/*
* DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
*
 * This code is free software; you can redistribute it and/or modify it
 * under the terms of the GNU General Public License version 2 only, as
 * published by the Free Software Foundation.  Oracle designates this
 * particular file as subject to the "Classpath" exception as provided
 * by Oracle in the LICENSE file that accompanied this code.
 *
 * This code is distributed in the hope that it will be useful, but WITHOUT
 * ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
 * FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License
 * version 2 for more details (a copy is included in the LICENSE file that
 * accompanied this code).
 *
 * You should have received a copy of the GNU General Public License version
 * 2 along with this work; if not, write to the Free Software Foundation,
* Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA.
*
* Please contact Oracle, 500 Oracle Parkway, Redwood Shores, CA 94065 USA
* or visit www.oracle.com if you need additional information or have any
* questions.
*/
package org.graalvm.visualvm.jmx.impl;
import com.sun.management.HotSpotDiagnosticMXBean;
import com.sun.management.VMOption;
import java.io.IOException;
import java.lang.management.LockInfo;
import java.lang.management.ManagementFactory;
import java.lang.management.MonitorInfo;
import java.lang.management.RuntimeMXBean;
import java.lang.management.ThreadInfo;
import java.lang.management.ThreadMXBean;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.text.SimpleDateFormat;
import java.util.Collections;
import java.util.Date;
import java.util.Map;
import java.util.Properties;
import java.util.logging.Level;
import java.util.logging.Logger;
import javax.management.InstanceNotFoundException;
import javax.management.MBeanException;
import javax.management.MBeanInfo;
import javax.management.MBeanOperationInfo;
import javax.management.MBeanServerConnection;
import javax.management.MalformedObjectNameException;
import javax.management.ObjectName;
import javax.management.ReflectionException;
import org.graalvm.visualvm.application.jvm.HeapHistogram;
import org.graalvm.visualvm.tools.jmx.JmxModel;
import org.graalvm.visualvm.tools.jmx.JmxModel.ConnectionState;
import org.graalvm.visualvm.tools.jmx.JvmMXBeans;
import org.graalvm.visualvm.tools.jmx.JvmMXBeansFactory;
import org.openide.ErrorManager;
import org.openide.util.Exceptions;
/**
*
* @author Tomas Hurka
*/
public class JmxSupport {
private final static Logger LOGGER = Logger.getLogger(JmxSupport.class.getName());
private static final String HOTSPOT_DIAGNOSTIC_MXBEAN_NAME =
"com.sun.management:type=HotSpotDiagnostic"; // NOI18N
private static final String DIAGNOSTIC_COMMAND_MXBEAN_NAME =
"com.sun.management:type=DiagnosticCommand"; // NOI18N
private static final String ALL_OBJECTS_OPTION = "-all"; // NOI18N
private static final String HISTOGRAM_COMMAND = "gcClassHistogram"; // NOI18N
private static final String CMDLINE_COMMAND = "vmCommandLine"; // NOI18N
private static final String CMDLINE_PREFIX = "java_command: "; // NOI18N
private static final String CMDLINE_EMPTY = "<unknown>"; // NOI18N
private JvmMXBeans mxbeans;
private JmxModel jmxModel;
// HotspotDiagnostic
private boolean hotspotDiagnosticInitialized;
private final Object hotspotDiagnosticLock = new Object();
private HotSpotDiagnosticMXBean hotspotDiagnosticMXBean;
private final Object readOnlyConnectionLock = new Object();
private Boolean readOnlyConnection;
private Boolean hasDumpAllThreads;
private final Object hasDumpAllThreadsLock = new Object();
private String commandLine;
private final Object commandLineLock = new Object();
JmxSupport(JmxModel jmx) {
jmxModel = jmx;
}
private RuntimeMXBean getRuntime() {
JvmMXBeans jmx = getJvmMXBeans();
if (jmx != null) {
return jmx.getRuntimeMXBean();
}
return null;
}
private synchronized JvmMXBeans getJvmMXBeans() {
if (mxbeans == null) {
if (jmxModel.getConnectionState() == ConnectionState.CONNECTED) {
mxbeans = JvmMXBeansFactory.getJvmMXBeans(jmxModel);
}
}
return mxbeans;
}
Properties getSystemProperties() {
try {
RuntimeMXBean runtime = getRuntime();
if (runtime != null) {
Properties prop = new Properties();
prop.putAll(runtime.getSystemProperties());
return prop;
}
return null;
} catch (Exception e) {
LOGGER.log(Level.INFO, "getSystemProperties", e); // NOI18N
return null;
}
}
boolean isReadOnlyConnection() {
synchronized (readOnlyConnectionLock) {
if (readOnlyConnection == null) {
readOnlyConnection = Boolean.FALSE;
ThreadMXBean threads = getThreadBean();
if (threads != null) {
try {
threads.getThreadInfo(1);
} catch (SecurityException ex) {
readOnlyConnection = Boolean.TRUE;
}
}
}
return readOnlyConnection.booleanValue();
}
}
ThreadMXBean getThreadBean() {
JvmMXBeans jmx = getJvmMXBeans();
if (jmx != null) {
return jmx.getThreadMXBean();
}
return null;
}
HotSpotDiagnosticMXBean getHotSpotDiagnostic() {
synchronized (hotspotDiagnosticLock) {
if (hotspotDiagnosticInitialized) {
return hotspotDiagnosticMXBean;
}
JvmMXBeans jmx = getJvmMXBeans();
if (jmx != null) {
try {
hotspotDiagnosticMXBean = jmx.getMXBean(
ObjectName.getInstance(HOTSPOT_DIAGNOSTIC_MXBEAN_NAME),
HotSpotDiagnosticMXBean.class);
} catch (MalformedObjectNameException e) {
ErrorManager.getDefault().log(ErrorManager.WARNING,
"Couldn't find HotSpotDiagnosticMXBean: " + // NOI18N
e.getLocalizedMessage());
} catch (IllegalArgumentException ex) {
ErrorManager.getDefault().notify(ErrorManager.INFORMATIONAL, ex);
}
}
hotspotDiagnosticInitialized = true;
return hotspotDiagnosticMXBean;
}
}
String takeThreadDump(long[] threadIds) {
try {
ThreadMXBean threadMXBean = getThreadBean();
if (threadMXBean == null) {
return null;
}
ThreadInfo[] threads;
StringBuilder sb = new StringBuilder(4096);
SimpleDateFormat df = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss"); // NOI18N
if (hasDumpAllThreads()) {
threads = threadMXBean.getThreadInfo(threadIds, true, true);
} else {
threads = threadMXBean.getThreadInfo(threadIds, Integer.MAX_VALUE);
}
sb.append(df.format(new Date()) + "\n"); // NOI18N
printThreads(sb, threadMXBean, threads);
return sb.toString();
} catch (Exception e) {
LOGGER.log(Level.INFO, "takeThreadDump[]", e); // NOI18N
return null;
}
}
String takeThreadDump() {
try {
ThreadMXBean threadMXBean = getThreadBean();
if (threadMXBean == null) {
return null;
}
ThreadInfo[] threads;
Properties prop = getSystemProperties();
StringBuilder sb = new StringBuilder(4096);
SimpleDateFormat df = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss"); // NOI18N
sb.append(df.format(new Date()) + "\n");
sb.append("Full thread dump " + prop.getProperty("java.vm.name") + // NOI18N
" (" + prop.getProperty("java.vm.version") + " " + // NOI18N
prop.getProperty("java.vm.info") + "):\n"); // NOI18N
if (hasDumpAllThreads()) {
threads = threadMXBean.dumpAllThreads(true, true);
} else {
long[] threadIds = threadMXBean.getAllThreadIds();
threads = threadMXBean.getThreadInfo(threadIds, Integer.MAX_VALUE);
}
printThreads(sb, threadMXBean, threads);
return sb.toString();
} catch (Exception e) {
LOGGER.log(Level.INFO,"takeThreadDump", e); // NOI18N
return null;
}
}
private void printThreads(final StringBuilder sb, final ThreadMXBean threadMXBean, ThreadInfo[] threads) {
boolean jdk16 = hasDumpAllThreads();
for (ThreadInfo thread : threads) {
if (thread != null) {
if (jdk16) {
print16Thread(sb, threadMXBean, thread);
} else {
print15Thread(sb, thread);
}
}
}
}
private void print16Thread(final StringBuilder sb, final ThreadMXBean threadMXBean, final ThreadInfo thread) {
MonitorInfo[] monitors = null;
if (threadMXBean.isObjectMonitorUsageSupported()) {
monitors = thread.getLockedMonitors();
}
sb.append("\n\"" + thread.getThreadName() + // NOI18N
"\" - Thread t@" + thread.getThreadId() + "\n"); // NOI18N
sb.append(" java.lang.Thread.State: " + thread.getThreadState()); // NOI18N
sb.append("\n"); // NOI18N
int index = 0;
for (StackTraceElement st : thread.getStackTrace()) {
LockInfo lock = thread.getLockInfo();
String lockOwner = thread.getLockOwnerName();
sb.append("\tat " + st.toString() + "\n"); // NOI18N
if (index == 0) {
if ("java.lang.Object".equals(st.getClassName()) && // NOI18N
"wait".equals(st.getMethodName())) { // NOI18N
if (lock != null) {
sb.append("\t- waiting on "); // NOI18N
printLock(sb,lock);
sb.append("\n"); // NOI18N
}
} else if (lock != null) {
if (lockOwner == null) {
sb.append("\t- parking to wait for "); // NOI18N
printLock(sb,lock);
sb.append("\n"); // NOI18N
} else {
sb.append("\t- waiting to lock "); // NOI18N
printLock(sb,lock);
sb.append(" owned by \""+lockOwner+"\" t@"+thread.getLockOwnerId()+"\n"); // NOI18N
}
}
}
printMonitors(sb, monitors, index);
index++;
}
StringBuilder jnisb = new StringBuilder();
printMonitors(jnisb, monitors, -1);
if (jnisb.length() > 0) {
sb.append(" JNI locked monitors:\n");
sb.append(jnisb);
}
if (threadMXBean.isSynchronizerUsageSupported()) {
sb.append("\n Locked ownable synchronizers:"); // NOI18N
LockInfo[] synchronizers = thread.getLockedSynchronizers();
if (synchronizers == null || synchronizers.length == 0) {
sb.append("\n\t- None\n"); // NOI18N
} else {
for (LockInfo li : synchronizers) {
sb.append("\n\t- locked "); // NOI18N
printLock(sb,li);
sb.append("\n"); // NOI18N
}
}
}
}
private void printMonitors(final StringBuilder sb, final MonitorInfo[] monitors, final int index) {
if (monitors != null) {
for (MonitorInfo mi : monitors) {
if (mi.getLockedStackDepth() == index) {
sb.append("\t- locked "); // NOI18N
printLock(sb,mi);
sb.append("\n"); // NOI18N
}
}
}
}
private void print15Thread(final StringBuilder sb, final ThreadInfo thread) {
sb.append("\n\"" + thread.getThreadName() + // NOI18N
"\" - Thread t@" + thread.getThreadId() + "\n"); // NOI18N
sb.append(" java.lang.Thread.State: " + thread.getThreadState()); // NOI18N
if (thread.getLockName() != null) {
sb.append(" on " + thread.getLockName()); // NOI18N
if (thread.getLockOwnerName() != null) {
sb.append(" owned by: " + thread.getLockOwnerName()); // NOI18N
}
}
sb.append("\n"); // NOI18N
for (StackTraceElement st : thread.getStackTrace()) {
sb.append(" at " + st.toString() + "\n"); // NOI18N
}
}
private void printLock(StringBuilder sb,LockInfo lock) {
String id = Integer.toHexString(lock.getIdentityHashCode());
String className = lock.getClassName();
sb.append("<"+id+"> (a "+className+")"); // NOI18N
}
boolean takeHeapDump(String fileName) {
HotSpotDiagnosticMXBean hsDiagnostic = getHotSpotDiagnostic();
if (hsDiagnostic != null) {
try {
hsDiagnostic.dumpHeap(fileName,true);
} catch (IOException ex) {
LOGGER.log(Level.INFO,"takeHeapDump", ex); // NOI18N
try {
Path f = Paths.get(fileName);
Files.deleteIfExists(f);
} catch (IOException ex1) {
LOGGER.log(Level.INFO,"takeHeapDump", ex1); // NOI18N
}
return false;
}
return true;
}
return false;
}
String getFlagValue(String name) {
try {
HotSpotDiagnosticMXBean hsDiagnostic = getHotSpotDiagnostic();
if (hsDiagnostic != null) {
VMOption option = hsDiagnostic.getVMOption(name);
if (option != null) {
return option.getValue();
}
}
return null;
} catch (IllegalArgumentException ex) {
// non-existing VM option
LOGGER.log(Level.FINE, "getFlagValue", ex); // NOI18N
return null;
} catch (Exception ex) {
LOGGER.log(Level.INFO, "getFlagValue", ex); // NOI18N
return null;
}
}
HeapHistogram takeHeapHistogram() {
if (isReadOnlyConnection()) return null;
String histo = executeJCmd(HISTOGRAM_COMMAND, Collections.singletonMap(ALL_OBJECTS_OPTION, null));
if (histo != null) {
return new HeapHistogramImpl(histo);
}
return null;
}
void setFlagValue(String name, String value) {
try {
HotSpotDiagnosticMXBean hsDiagnostic = getHotSpotDiagnostic();
if (hsDiagnostic != null) {
hsDiagnostic.setVMOption(name,value);
}
} catch (Exception ex) {
LOGGER.log(Level.INFO,"setFlagValue", ex); // NOI18N
}
}
private boolean hasDumpAllThreads() {
synchronized (hasDumpAllThreadsLock) {
if (hasDumpAllThreads == null) {
hasDumpAllThreads = Boolean.FALSE;
try {
ObjectName threadObjName = new ObjectName(ManagementFactory.THREAD_MXBEAN_NAME);
MBeanInfo threadInfo = jmxModel.getMBeanServerConnection().getMBeanInfo(threadObjName);
if (threadInfo != null) {
for (MBeanOperationInfo op : threadInfo.getOperations()) {
if ("dumpAllThreads".equals(op.getName())) {
hasDumpAllThreads = Boolean.TRUE;
}
}
}
} catch (Exception ex) {
LOGGER.log(Level.INFO,"hasDumpAllThreads", ex); // NOI18N
}
}
return hasDumpAllThreads.booleanValue();
}
}
String getCommandLine() {
synchronized (commandLineLock) {
if (commandLine == null) {
String vmCommandLine = executeJCmd(CMDLINE_COMMAND);
if (vmCommandLine != null) {
commandLine = parseVMCommandLine(vmCommandLine);
}
}
return commandLine;
}
}
private String executeJCmd(String command) {
return executeJCmd(command, Collections.emptyMap());
}
String executeJCmd(String command, Map<String,Object> pars) {
if (jmxModel.getConnectionState() == ConnectionState.CONNECTED) {
MBeanServerConnection conn = jmxModel.getMBeanServerConnection();
try {
ObjectName diagCommName = new ObjectName(DIAGNOSTIC_COMMAND_MXBEAN_NAME);
if (conn.isRegistered(diagCommName)) {
Object[] params = null;
String[] signature = null;
Object ret;
if (!pars.isEmpty()) {
params = new Object[] {getJCmdParams(pars)};
signature = new String[] {String[].class.getName()};
}
ret = conn.invoke(diagCommName, command, params, signature);
if (ret instanceof String) {
return (String)ret;
}
}
} catch (MalformedObjectNameException ex) {
Exceptions.printStackTrace(ex);
} catch (IOException ex) {
LOGGER.log(Level.INFO,"executeJCmd", ex); // NOI18N
} catch (InstanceNotFoundException ex) {
Exceptions.printStackTrace(ex);
} catch (MBeanException ex) {
Exceptions.printStackTrace(ex);
} catch (ReflectionException ex) {
Exceptions.printStackTrace(ex);
}
}
return null;
}
private String parseVMCommandLine(String vmCommandLine) {
String[] lines = vmCommandLine.split("\\r?\\n");
for (String line : lines) {
if (line.startsWith(CMDLINE_PREFIX)) {
String cmdline = line.substring(CMDLINE_PREFIX.length());
if (CMDLINE_EMPTY.equals(cmdline)) return "";
return cmdline;
}
}
return null;
}
private static String[] getJCmdParams(Map<String, Object> pars) {
String[] jcmdParams = new String[pars.size()];
int i = 0;
for (Map.Entry<String,Object> e : pars.entrySet()) {
String par;
String key = e.getKey();
Object val = e.getValue();
if (val == null) {
par = key;
} else {
par = String.format("%s=%s", key, quoteString(val.toString())); // NOI18N
}
jcmdParams[i++] = par;
}
return jcmdParams;
}
private static String quoteString(String val) {
if (val.indexOf(' ')>=0) {
return "\""+val+"\""; //NOI18N
}
return val;
}
}
```
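The jcmd parameter formatting implemented by `getJCmdParams` and `quoteString` above (a bare key for valueless options, `key=value` otherwise, quoting any value that contains a space) can be sketched in Python; `jcmd_params` is a hypothetical helper name:

```python
def jcmd_params(pars):
    """Mirror of getJCmdParams: format a dict of jcmd options into argument strings."""
    out = []
    for key, val in pars.items():
        if val is None:
            out.append(key)             # valueless flag, e.g. "-all"
        else:
            s = str(val)
            if " " in s:
                s = '"' + s + '"'       # quote values containing spaces
            out.append(f"{key}={s}")
    return out

print(jcmd_params({"-all": None, "file": "my dump.hprof"}))
# → ['-all', 'file="my dump.hprof"']
```

The resulting string array is what gets passed as the single `String[]` argument when invoking a DiagnosticCommand operation over JMX.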
Villavaliente is a municipality in Albacete, Castile-La Mancha, Spain. It has a population of 279.
Gallery
References
Municipalities of the Province of Albacete
Edward William Burke (3 July 1847 – 10 November 1915) was an Irish priest, president of Carlow College, and founder of St. Joseph's Academy. Born in Clane, County Kildare, Ireland, he attended Carlow College and shortly afterwards went to Maynooth College, where he was ordained a priest. After one year in the Dunboyne Institute, he became a professor, vice-president and president of Carlow College, and parish priest of Bagenalstown, County Carlow, where he established St. Joseph's Academy.
Burke was born in Hodgestown, Clane, Co. Kildare, on 3 July 1847. He was educated at Carlow College, and at Maynooth before being ordained for the priesthood in Maynooth in 1869.
Following a year in the Dunboyne Institute in Maynooth, Burke became a professor in Carlow College, served as vice-president of the college from 1874 and as college president from 1880 until 1892. In 1890 he was appointed parish priest of Bagenalstown, Co. Carlow.
During his presidency of Carlow College the College Chapel was built, and the Lay College was transferred to Knockbeg.
While in Bagenalstown, Burke established St. Joseph’s Academy and invited the De La Salle Brothers to staff both it and St. Brigid’s National School.
He died on 10 November 1915.
See also
Catholic Church in Ireland
References
Alumni of Carlow College
Alumni of St Patrick's College, Maynooth
Academics of St. Patrick's, Carlow College
1847 births
1915 deaths
20th-century Irish Roman Catholic priests
19th-century Irish Roman Catholic priests
Christian clergy from County Kildare
People from Clane
```fortran
*> \brief \b SLATPS solves a triangular system of equations with the matrix held in packed storage.
*
* =========== DOCUMENTATION ===========
*
* Online html documentation available at
* path_to_url
*
*> \htmlonly
*> Download SLATPS + dependencies
*> <a href="path_to_url">
*> [TGZ]</a>
*> <a href="path_to_url">
*> [ZIP]</a>
*> <a href="path_to_url">
*> [TXT]</a>
*> \endhtmlonly
*
* Definition:
* ===========
*
* SUBROUTINE SLATPS( UPLO, TRANS, DIAG, NORMIN, N, AP, X, SCALE,
* CNORM, INFO )
*
* .. Scalar Arguments ..
* CHARACTER DIAG, NORMIN, TRANS, UPLO
* INTEGER INFO, N
* REAL SCALE
* ..
* .. Array Arguments ..
* REAL AP( * ), CNORM( * ), X( * )
* ..
*
*
*> \par Purpose:
* =============
*>
*> \verbatim
*>
*> SLATPS solves one of the triangular systems
*>
*> A *x = s*b or A**T*x = s*b
*>
*> with scaling to prevent overflow, where A is an upper or lower
*> triangular matrix stored in packed form. Here A**T denotes the
*> transpose of A, x and b are n-element vectors, and s is a scaling
*> factor, usually less than or equal to 1, chosen so that the
*> components of x will be less than the overflow threshold. If the
*> unscaled problem will not cause overflow, the Level 2 BLAS routine
*> STPSV is called. If the matrix A is singular (A(j,j) = 0 for some j),
*> then s is set to 0 and a non-trivial solution to A*x = 0 is returned.
*> \endverbatim
*
* Arguments:
* ==========
*
*> \param[in] UPLO
*> \verbatim
*> UPLO is CHARACTER*1
*> Specifies whether the matrix A is upper or lower triangular.
*> = 'U': Upper triangular
*> = 'L': Lower triangular
*> \endverbatim
*>
*> \param[in] TRANS
*> \verbatim
*> TRANS is CHARACTER*1
*> Specifies the operation applied to A.
*> = 'N': Solve A * x = s*b (No transpose)
*> = 'T': Solve A**T* x = s*b (Transpose)
*> = 'C': Solve A**T* x = s*b (Conjugate transpose = Transpose)
*> \endverbatim
*>
*> \param[in] DIAG
*> \verbatim
*> DIAG is CHARACTER*1
*> Specifies whether or not the matrix A is unit triangular.
*> = 'N': Non-unit triangular
*> = 'U': Unit triangular
*> \endverbatim
*>
*> \param[in] NORMIN
*> \verbatim
*> NORMIN is CHARACTER*1
*> Specifies whether CNORM has been set or not.
*> = 'Y': CNORM contains the column norms on entry
*> = 'N': CNORM is not set on entry. On exit, the norms will
*> be computed and stored in CNORM.
*> \endverbatim
*>
*> \param[in] N
*> \verbatim
*> N is INTEGER
*> The order of the matrix A. N >= 0.
*> \endverbatim
*>
*> \param[in] AP
*> \verbatim
*> AP is REAL array, dimension (N*(N+1)/2)
*> The upper or lower triangular matrix A, packed columnwise in
*> a linear array. The j-th column of A is stored in the array
*> AP as follows:
*> if UPLO = 'U', AP(i + (j-1)*j/2) = A(i,j) for 1<=i<=j;
*> if UPLO = 'L', AP(i + (j-1)*(2n-j)/2) = A(i,j) for j<=i<=n.
*> \endverbatim
*>
*> \param[in,out] X
*> \verbatim
*> X is REAL array, dimension (N)
*> On entry, the right hand side b of the triangular system.
*> On exit, X is overwritten by the solution vector x.
*> \endverbatim
*>
*> \param[out] SCALE
*> \verbatim
*> SCALE is REAL
*> The scaling factor s for the triangular system
*> A * x = s*b or A**T* x = s*b.
*> If SCALE = 0, the matrix A is singular or badly scaled, and
*> the vector x is an exact or approximate solution to A*x = 0.
*> \endverbatim
*>
*> \param[in,out] CNORM
*> \verbatim
*> CNORM is REAL array, dimension (N)
*>
*> If NORMIN = 'Y', CNORM is an input argument and CNORM(j)
*> contains the norm of the off-diagonal part of the j-th column
*> of A. If TRANS = 'N', CNORM(j) must be greater than or equal
*> to the infinity-norm, and if TRANS = 'T' or 'C', CNORM(j)
*> must be greater than or equal to the 1-norm.
*>
*> If NORMIN = 'N', CNORM is an output argument and CNORM(j)
*> returns the 1-norm of the off-diagonal part of the j-th column
*> of A.
*> \endverbatim
*>
*> \param[out] INFO
*> \verbatim
*> INFO is INTEGER
*> = 0: successful exit
*> < 0: if INFO = -k, the k-th argument had an illegal value
*> \endverbatim
*
* Authors:
* ========
*
*> \author Univ. of Tennessee
*> \author Univ. of California Berkeley
*> \author Univ. of Colorado Denver
*> \author NAG Ltd.
*
*> \ingroup latps
*
*> \par Further Details:
* =====================
*>
*> \verbatim
*>
*> A rough bound on x is computed; if that is less than overflow, STPSV
*> is called, otherwise, specific code is used which checks for possible
*> overflow or divide-by-zero at every operation.
*>
*> A columnwise scheme is used for solving A*x = b. The basic algorithm
*> if A is lower triangular is
*>
*> x[1:n] := b[1:n]
*> for j = 1, ..., n
*> x(j) := x(j) / A(j,j)
*> x[j+1:n] := x[j+1:n] - x(j) * A[j+1:n,j]
*> end
*>
*> Define bounds on the components of x after j iterations of the loop:
*> M(j) = bound on x[1:j]
*> G(j) = bound on x[j+1:n]
*> Initially, let M(0) = 0 and G(0) = max{x(i), i=1,...,n}.
*>
*> Then for iteration j+1 we have
*> M(j+1) <= G(j) / | A(j+1,j+1) |
*> G(j+1) <= G(j) + M(j+1) * | A[j+2:n,j+1] |
*> <= G(j) ( 1 + CNORM(j+1) / | A(j+1,j+1) | )
*>
*> where CNORM(j+1) is greater than or equal to the infinity-norm of
*> column j+1 of A, not counting the diagonal. Hence
*>
*> G(j) <= G(0) product ( 1 + CNORM(i) / | A(i,i) | )
*> 1<=i<=j
*> and
*>
*> |x(j)| <= ( G(0) / |A(j,j)| ) product ( 1 + CNORM(i) / |A(i,i)| )
*> 1<=i< j
*>
*> Since |x(j)| <= M(j), we use the Level 2 BLAS routine STPSV if the
*> reciprocal of the largest M(j), j=1,..,n, is larger than
*> max(underflow, 1/overflow).
*>
*> The bound on x(j) is also used to determine when a step in the
*> columnwise method can be performed without fear of overflow. If
*> the computed bound is greater than a large constant, x is scaled to
*> prevent overflow, but if the bound overflows, x is set to 0, x(j) to
*> 1, and scale to 0, and a non-trivial solution to A*x = 0 is found.
*>
*> Similarly, a row-wise scheme is used to solve A**T*x = b. The basic
*> algorithm for A upper triangular is
*>
*> for j = 1, ..., n
*> x(j) := ( b(j) - A[1:j-1,j]**T * x[1:j-1] ) / A(j,j)
*> end
*>
*> We simultaneously compute two bounds
*> G(j) = bound on ( b(i) - A[1:i-1,i]**T * x[1:i-1] ), 1<=i<=j
*> M(j) = bound on x(i), 1<=i<=j
*>
*> The initial values are G(0) = 0, M(0) = max{b(i), i=1,..,n}, and we
*> add the constraint G(j) >= G(j-1) and M(j) >= M(j-1) for j >= 1.
*> Then the bound on x(j) is
*>
*> M(j) <= M(j-1) * ( 1 + CNORM(j) ) / | A(j,j) |
*>
*> <= M(0) * product ( ( 1 + CNORM(i) ) / |A(i,i)| )
*> 1<=i<=j
*>
*> and we can safely call STPSV if 1/M(n) and 1/G(n) are both greater
*> than max(underflow, 1/overflow).
*> \endverbatim
*>
* =====================================================================
SUBROUTINE SLATPS( UPLO, TRANS, DIAG, NORMIN, N, AP, X, SCALE,
$ CNORM, INFO )
*
* -- LAPACK auxiliary routine --
* -- LAPACK is a software package provided by Univ. of Tennessee, --
* -- Univ. of California Berkeley, Univ. of Colorado Denver and NAG Ltd..--
*
* .. Scalar Arguments ..
CHARACTER DIAG, NORMIN, TRANS, UPLO
INTEGER INFO, N
REAL SCALE
* ..
* .. Array Arguments ..
REAL AP( * ), CNORM( * ), X( * )
* ..
*
* =====================================================================
*
* .. Parameters ..
REAL ZERO, HALF, ONE
PARAMETER ( ZERO = 0.0E+0, HALF = 0.5E+0, ONE = 1.0E+0 )
* ..
* .. Local Scalars ..
LOGICAL NOTRAN, NOUNIT, UPPER
INTEGER I, IMAX, IP, J, JFIRST, JINC, JLAST, JLEN
REAL BIGNUM, GROW, REC, SMLNUM, SUMJ, TJJ, TJJS,
$ TMAX, TSCAL, USCAL, XBND, XJ, XMAX
* ..
* .. External Functions ..
LOGICAL LSAME
INTEGER ISAMAX
REAL SASUM, SDOT, SLAMCH
EXTERNAL LSAME, ISAMAX, SASUM, SDOT, SLAMCH
* ..
* .. External Subroutines ..
EXTERNAL SAXPY, SSCAL, STPSV, XERBLA
* ..
* .. Intrinsic Functions ..
INTRINSIC ABS, MAX, MIN
* ..
* .. Executable Statements ..
*
INFO = 0
UPPER = LSAME( UPLO, 'U' )
NOTRAN = LSAME( TRANS, 'N' )
NOUNIT = LSAME( DIAG, 'N' )
*
* Test the input parameters.
*
IF( .NOT.UPPER .AND. .NOT.LSAME( UPLO, 'L' ) ) THEN
INFO = -1
ELSE IF( .NOT.NOTRAN .AND. .NOT.LSAME( TRANS, 'T' ) .AND. .NOT.
$ LSAME( TRANS, 'C' ) ) THEN
INFO = -2
ELSE IF( .NOT.NOUNIT .AND. .NOT.LSAME( DIAG, 'U' ) ) THEN
INFO = -3
ELSE IF( .NOT.LSAME( NORMIN, 'Y' ) .AND. .NOT.
$ LSAME( NORMIN, 'N' ) ) THEN
INFO = -4
ELSE IF( N.LT.0 ) THEN
INFO = -5
END IF
IF( INFO.NE.0 ) THEN
CALL XERBLA( 'SLATPS', -INFO )
RETURN
END IF
*
* Quick return if possible
*
IF( N.EQ.0 )
$ RETURN
*
* Determine machine dependent parameters to control overflow.
*
SMLNUM = SLAMCH( 'Safe minimum' ) / SLAMCH( 'Precision' )
BIGNUM = ONE / SMLNUM
SCALE = ONE
*
IF( LSAME( NORMIN, 'N' ) ) THEN
*
* Compute the 1-norm of each column, not including the diagonal.
*
IF( UPPER ) THEN
*
* A is upper triangular.
*
IP = 1
DO 10 J = 1, N
CNORM( J ) = SASUM( J-1, AP( IP ), 1 )
IP = IP + J
10 CONTINUE
ELSE
*
* A is lower triangular.
*
IP = 1
DO 20 J = 1, N - 1
CNORM( J ) = SASUM( N-J, AP( IP+1 ), 1 )
IP = IP + N - J + 1
20 CONTINUE
CNORM( N ) = ZERO
END IF
END IF
*
* Scale the column norms by TSCAL if the maximum element in CNORM is
* greater than BIGNUM.
*
IMAX = ISAMAX( N, CNORM, 1 )
TMAX = CNORM( IMAX )
IF( TMAX.LE.BIGNUM ) THEN
TSCAL = ONE
ELSE
TSCAL = ONE / ( SMLNUM*TMAX )
CALL SSCAL( N, TSCAL, CNORM, 1 )
END IF
*
* Compute a bound on the computed solution vector to see if the
* Level 2 BLAS routine STPSV can be used.
*
J = ISAMAX( N, X, 1 )
XMAX = ABS( X( J ) )
XBND = XMAX
IF( NOTRAN ) THEN
*
* Compute the growth in A * x = b.
*
IF( UPPER ) THEN
JFIRST = N
JLAST = 1
JINC = -1
ELSE
JFIRST = 1
JLAST = N
JINC = 1
END IF
*
IF( TSCAL.NE.ONE ) THEN
GROW = ZERO
GO TO 50
END IF
*
IF( NOUNIT ) THEN
*
* A is non-unit triangular.
*
* Compute GROW = 1/G(j) and XBND = 1/M(j).
* Initially, G(0) = max{x(i), i=1,...,n}.
*
GROW = ONE / MAX( XBND, SMLNUM )
XBND = GROW
IP = JFIRST*( JFIRST+1 ) / 2
JLEN = N
DO 30 J = JFIRST, JLAST, JINC
*
* Exit the loop if the growth factor is too small.
*
IF( GROW.LE.SMLNUM )
$ GO TO 50
*
* M(j) = G(j-1) / abs(A(j,j))
*
TJJ = ABS( AP( IP ) )
XBND = MIN( XBND, MIN( ONE, TJJ )*GROW )
IF( TJJ+CNORM( J ).GE.SMLNUM ) THEN
*
* G(j) = G(j-1)*( 1 + CNORM(j) / abs(A(j,j)) )
*
GROW = GROW*( TJJ / ( TJJ+CNORM( J ) ) )
ELSE
*
* G(j) could overflow, set GROW to 0.
*
GROW = ZERO
END IF
IP = IP + JINC*JLEN
JLEN = JLEN - 1
30 CONTINUE
GROW = XBND
ELSE
*
* A is unit triangular.
*
* Compute GROW = 1/G(j), where G(0) = max{x(i), i=1,...,n}.
*
GROW = MIN( ONE, ONE / MAX( XBND, SMLNUM ) )
DO 40 J = JFIRST, JLAST, JINC
*
* Exit the loop if the growth factor is too small.
*
IF( GROW.LE.SMLNUM )
$ GO TO 50
*
* G(j) = G(j-1)*( 1 + CNORM(j) )
*
GROW = GROW*( ONE / ( ONE+CNORM( J ) ) )
40 CONTINUE
END IF
50 CONTINUE
*
ELSE
*
* Compute the growth in A**T * x = b.
*
IF( UPPER ) THEN
JFIRST = 1
JLAST = N
JINC = 1
ELSE
JFIRST = N
JLAST = 1
JINC = -1
END IF
*
IF( TSCAL.NE.ONE ) THEN
GROW = ZERO
GO TO 80
END IF
*
IF( NOUNIT ) THEN
*
* A is non-unit triangular.
*
* Compute GROW = 1/G(j) and XBND = 1/M(j).
* Initially, M(0) = max{x(i), i=1,...,n}.
*
GROW = ONE / MAX( XBND, SMLNUM )
XBND = GROW
IP = JFIRST*( JFIRST+1 ) / 2
JLEN = 1
DO 60 J = JFIRST, JLAST, JINC
*
* Exit the loop if the growth factor is too small.
*
IF( GROW.LE.SMLNUM )
$ GO TO 80
*
* G(j) = max( G(j-1), M(j-1)*( 1 + CNORM(j) ) )
*
XJ = ONE + CNORM( J )
GROW = MIN( GROW, XBND / XJ )
*
* M(j) = M(j-1)*( 1 + CNORM(j) ) / abs(A(j,j))
*
TJJ = ABS( AP( IP ) )
IF( XJ.GT.TJJ )
$ XBND = XBND*( TJJ / XJ )
JLEN = JLEN + 1
IP = IP + JINC*JLEN
60 CONTINUE
GROW = MIN( GROW, XBND )
ELSE
*
* A is unit triangular.
*
* Compute GROW = 1/G(j), where G(0) = max{x(i), i=1,...,n}.
*
GROW = MIN( ONE, ONE / MAX( XBND, SMLNUM ) )
DO 70 J = JFIRST, JLAST, JINC
*
* Exit the loop if the growth factor is too small.
*
IF( GROW.LE.SMLNUM )
$ GO TO 80
*
* G(j) = ( 1 + CNORM(j) )*G(j-1)
*
XJ = ONE + CNORM( J )
GROW = GROW / XJ
70 CONTINUE
END IF
80 CONTINUE
END IF
*
IF( ( GROW*TSCAL ).GT.SMLNUM ) THEN
*
* Use the Level 2 BLAS solve if the reciprocal of the bound on
* elements of X is not too small.
*
CALL STPSV( UPLO, TRANS, DIAG, N, AP, X, 1 )
ELSE
*
* Use a Level 1 BLAS solve, scaling intermediate results.
*
IF( XMAX.GT.BIGNUM ) THEN
*
* Scale X so that its components are less than or equal to
* BIGNUM in absolute value.
*
SCALE = BIGNUM / XMAX
CALL SSCAL( N, SCALE, X, 1 )
XMAX = BIGNUM
END IF
*
IF( NOTRAN ) THEN
*
* Solve A * x = b
*
IP = JFIRST*( JFIRST+1 ) / 2
DO 100 J = JFIRST, JLAST, JINC
*
* Compute x(j) = b(j) / A(j,j), scaling x if necessary.
*
XJ = ABS( X( J ) )
IF( NOUNIT ) THEN
TJJS = AP( IP )*TSCAL
ELSE
TJJS = TSCAL
IF( TSCAL.EQ.ONE )
$ GO TO 95
END IF
TJJ = ABS( TJJS )
IF( TJJ.GT.SMLNUM ) THEN
*
* abs(A(j,j)) > SMLNUM:
*
IF( TJJ.LT.ONE ) THEN
IF( XJ.GT.TJJ*BIGNUM ) THEN
*
* Scale x by 1/b(j).
*
REC = ONE / XJ
CALL SSCAL( N, REC, X, 1 )
SCALE = SCALE*REC
XMAX = XMAX*REC
END IF
END IF
X( J ) = X( J ) / TJJS
XJ = ABS( X( J ) )
ELSE IF( TJJ.GT.ZERO ) THEN
*
* 0 < abs(A(j,j)) <= SMLNUM:
*
IF( XJ.GT.TJJ*BIGNUM ) THEN
*
* Scale x by (1/abs(x(j)))*abs(A(j,j))*BIGNUM
* to avoid overflow when dividing by A(j,j).
*
REC = ( TJJ*BIGNUM ) / XJ
IF( CNORM( J ).GT.ONE ) THEN
*
* Scale by 1/CNORM(j) to avoid overflow when
* multiplying x(j) times column j.
*
REC = REC / CNORM( J )
END IF
CALL SSCAL( N, REC, X, 1 )
SCALE = SCALE*REC
XMAX = XMAX*REC
END IF
X( J ) = X( J ) / TJJS
XJ = ABS( X( J ) )
ELSE
*
* A(j,j) = 0: Set x(1:n) = 0, x(j) = 1, and
* scale = 0, and compute a solution to A*x = 0.
*
DO 90 I = 1, N
X( I ) = ZERO
90 CONTINUE
X( J ) = ONE
XJ = ONE
SCALE = ZERO
XMAX = ZERO
END IF
95 CONTINUE
*
* Scale x if necessary to avoid overflow when adding a
* multiple of column j of A.
*
IF( XJ.GT.ONE ) THEN
REC = ONE / XJ
IF( CNORM( J ).GT.( BIGNUM-XMAX )*REC ) THEN
*
* Scale x by 1/(2*abs(x(j))).
*
REC = REC*HALF
CALL SSCAL( N, REC, X, 1 )
SCALE = SCALE*REC
END IF
ELSE IF( XJ*CNORM( J ).GT.( BIGNUM-XMAX ) ) THEN
*
* Scale x by 1/2.
*
CALL SSCAL( N, HALF, X, 1 )
SCALE = SCALE*HALF
END IF
*
IF( UPPER ) THEN
IF( J.GT.1 ) THEN
*
* Compute the update
* x(1:j-1) := x(1:j-1) - x(j) * A(1:j-1,j)
*
CALL SAXPY( J-1, -X( J )*TSCAL, AP( IP-J+1 ), 1,
$ X,
$ 1 )
I = ISAMAX( J-1, X, 1 )
XMAX = ABS( X( I ) )
END IF
IP = IP - J
ELSE
IF( J.LT.N ) THEN
*
* Compute the update
* x(j+1:n) := x(j+1:n) - x(j) * A(j+1:n,j)
*
CALL SAXPY( N-J, -X( J )*TSCAL, AP( IP+1 ), 1,
$ X( J+1 ), 1 )
I = J + ISAMAX( N-J, X( J+1 ), 1 )
XMAX = ABS( X( I ) )
END IF
IP = IP + N - J + 1
END IF
100 CONTINUE
*
ELSE
*
* Solve A**T * x = b
*
IP = JFIRST*( JFIRST+1 ) / 2
JLEN = 1
DO 140 J = JFIRST, JLAST, JINC
*
* Compute x(j) = b(j) - sum A(k,j)*x(k).
* k<>j
*
XJ = ABS( X( J ) )
USCAL = TSCAL
REC = ONE / MAX( XMAX, ONE )
IF( CNORM( J ).GT.( BIGNUM-XJ )*REC ) THEN
*
* If x(j) could overflow, scale x by 1/(2*XMAX).
*
REC = REC*HALF
IF( NOUNIT ) THEN
TJJS = AP( IP )*TSCAL
ELSE
TJJS = TSCAL
END IF
TJJ = ABS( TJJS )
IF( TJJ.GT.ONE ) THEN
*
* Divide by A(j,j) when scaling x if A(j,j) > 1.
*
REC = MIN( ONE, REC*TJJ )
USCAL = USCAL / TJJS
END IF
IF( REC.LT.ONE ) THEN
CALL SSCAL( N, REC, X, 1 )
SCALE = SCALE*REC
XMAX = XMAX*REC
END IF
END IF
*
SUMJ = ZERO
IF( USCAL.EQ.ONE ) THEN
*
* If the scaling needed for A in the dot product is 1,
* call SDOT to perform the dot product.
*
IF( UPPER ) THEN
SUMJ = SDOT( J-1, AP( IP-J+1 ), 1, X, 1 )
ELSE IF( J.LT.N ) THEN
SUMJ = SDOT( N-J, AP( IP+1 ), 1, X( J+1 ), 1 )
END IF
ELSE
*
* Otherwise, use in-line code for the dot product.
*
IF( UPPER ) THEN
DO 110 I = 1, J - 1
SUMJ = SUMJ + ( AP( IP-J+I )*USCAL )*X( I )
110 CONTINUE
ELSE IF( J.LT.N ) THEN
DO 120 I = 1, N - J
SUMJ = SUMJ + ( AP( IP+I )*USCAL )*X( J+I )
120 CONTINUE
END IF
END IF
*
IF( USCAL.EQ.TSCAL ) THEN
*
* Compute x(j) := ( x(j) - sumj ) / A(j,j) if 1/A(j,j)
* was not used to scale the dot product.
*
X( J ) = X( J ) - SUMJ
XJ = ABS( X( J ) )
IF( NOUNIT ) THEN
*
* Compute x(j) = x(j) / A(j,j), scaling if necessary.
*
TJJS = AP( IP )*TSCAL
ELSE
TJJS = TSCAL
IF( TSCAL.EQ.ONE )
$ GO TO 135
END IF
TJJ = ABS( TJJS )
IF( TJJ.GT.SMLNUM ) THEN
*
* abs(A(j,j)) > SMLNUM:
*
IF( TJJ.LT.ONE ) THEN
IF( XJ.GT.TJJ*BIGNUM ) THEN
*
* Scale X by 1/abs(x(j)).
*
REC = ONE / XJ
CALL SSCAL( N, REC, X, 1 )
SCALE = SCALE*REC
XMAX = XMAX*REC
END IF
END IF
X( J ) = X( J ) / TJJS
ELSE IF( TJJ.GT.ZERO ) THEN
*
* 0 < abs(A(j,j)) <= SMLNUM:
*
IF( XJ.GT.TJJ*BIGNUM ) THEN
*
* Scale x by (1/abs(x(j)))*abs(A(j,j))*BIGNUM.
*
REC = ( TJJ*BIGNUM ) / XJ
CALL SSCAL( N, REC, X, 1 )
SCALE = SCALE*REC
XMAX = XMAX*REC
END IF
X( J ) = X( J ) / TJJS
ELSE
*
* A(j,j) = 0: Set x(1:n) = 0, x(j) = 1, and
* scale = 0, and compute a solution to A**T*x = 0.
*
DO 130 I = 1, N
X( I ) = ZERO
130 CONTINUE
X( J ) = ONE
SCALE = ZERO
XMAX = ZERO
END IF
135 CONTINUE
ELSE
*
* Compute x(j) := x(j) / A(j,j) - sumj if the dot
* product has already been divided by A(j,j).
*
X( J ) = X( J ) / TJJS - SUMJ
END IF
XMAX = MAX( XMAX, ABS( X( J ) ) )
JLEN = JLEN + 1
IP = IP + JINC*JLEN
140 CONTINUE
END IF
SCALE = SCALE / TSCAL
END IF
*
* Scale the column norms by 1/TSCAL for return.
*
IF( TSCAL.NE.ONE ) THEN
CALL SSCAL( N, ONE / TSCAL, CNORM, 1 )
END IF
*
RETURN
*
* End of SLATPS
*
END
```
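The growth-bound strategy described in the Further Details section above (compute a cheap bound on the solution's growth, then use the fast unguarded solve only when overflow is impossible) can be sketched in Python. This is an illustrative NumPy sketch of the columnwise scheme for a lower-triangular solve, not the SLATPS implementation; the function name, the `overflow` threshold, and the fallback behavior are invented for the example:

```python
import numpy as np

def solve_lower_scaled(A, b, overflow=1e30):
    """Columnwise forward solve of A*x = s*b for lower-triangular A.

    Mirrors the bound from the Further Details comments:
        G(j) <= G(0) * prod_i ( 1 + CNORM(i)/|A(i,i)| )
    If the bound stays below `overflow`, the plain solve is safe.
    """
    n = A.shape[0]
    x = b.astype(float).copy()
    # CNORM(j): 1-norm of the off-diagonal part of column j.
    cnorm = np.array([np.abs(A[j + 1:, j]).sum() for j in range(n)])
    grow = np.max(np.abs(x))  # G(0) = max |b(i)|
    for j in range(n):
        grow *= 1.0 + cnorm[j] / abs(A[j, j])
    if grow >= overflow:
        # SLATPS would fall back to a step-by-step solve that rescales x
        # (and returns SCALE < 1); that guarded path is omitted here.
        raise NotImplementedError("guarded scaling path not sketched")
    # Fast path: exactly the columnwise loop from the comment block.
    for j in range(n):
        x[j] = x[j] / A[j, j]             # x(j) := x(j) / A(j,j)
        x[j + 1:] -= x[j] * A[j + 1:, j]  # x(j+1:n) -= x(j)*A(j+1:n,j)
    return 1.0, x  # scale s = 1 on the fast path
```

SLATPS additionally handles a zero diagonal by setting the scale factor to 0 and returning a non-trivial solution of A*x = 0; that case is outside this sketch.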
```javascript
/**
* @license Apache-2.0
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
*    path_to_url
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
'use strict';

// MODULES //

var resolve = require( 'path' ).resolve;
var bench = require( '@stdlib/bench' );
var uniform = require( '@stdlib/random/array/uniform' );
var isnanf = require( '@stdlib/math/base/assert/is-nanf' );
var pow = require( '@stdlib/math/base/special/pow' );
var Complex64Array = require( '@stdlib/array/complex64' );
var reinterpret = require( '@stdlib/strided/base/reinterpret-complex64' );
var tryRequire = require( '@stdlib/utils/try-require' );
var pkg = require( './../package.json' ).name;


// VARIABLES //

var csrot = tryRequire( resolve( __dirname, './../lib/csrot.native.js' ) );
var opts = {
	'skip': ( csrot instanceof Error )
};
var options = {
	'dtype': 'float32'
};


// FUNCTIONS //

/**
* Creates a benchmark function.
*
* @private
* @param {PositiveInteger} len - array length
* @returns {Function} benchmark function
*/
function createBenchmark( len ) {
	var cx;
	var cy;

	cx = new Complex64Array( uniform( len*2, -100.0, 100.0, options ) );
	cy = new Complex64Array( uniform( len*2, -100.0, 100.0, options ) );

	return benchmark;

	/**
	* Benchmark function.
	*
	* @private
	* @param {Benchmark} b - benchmark instance
	*/
	function benchmark( b ) {
		var viewX;
		var i;

		viewX = reinterpret( cx, 0 );

		b.tic();
		for ( i = 0; i < b.iterations; i++ ) {
			csrot( cx.length, cx, 1, cy, 1, 0.8, 0.6 );
			if ( isnanf( viewX[ i%(len*2) ] ) ) {
				b.fail( 'should not return NaN' );
			}
		}
		b.toc();
		if ( isnanf( viewX[ i%(len*2) ] ) ) {
			b.fail( 'should not return NaN' );
		}
		b.pass( 'benchmark finished' );
		b.end();
	}
}


// MAIN //

/**
* Main execution sequence.
*
* @private
*/
function main() {
	var len;
	var min;
	var max;
	var f;
	var i;

	min = 1; // 10^min
	max = 6; // 10^max
	for ( i = min; i <= max; i++ ) {
		len = pow( 10, i );
		f = createBenchmark( len );
		bench( pkg+'::native:len='+len, opts, f );
	}
}

main();
```
```java
/**
 * This file is part of Skript.
 *
 * Skript is free software: you can redistribute it and/or modify
 * it under the terms of the GNU General Public License as published by
 * the Free Software Foundation, either version 3 of the License, or
 * (at your option) any later version.
 *
 * Skript is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 * GNU General Public License for more details.
 *
 * You should have received a copy of the GNU General Public License
 * along with Skript. If not, see <path_to_url
 *
 */
package ch.njol.skript.classes;

import java.io.NotSerializableException;
import java.io.StreamCorruptedException;

import org.bukkit.configuration.InvalidConfigurationException;
import org.bukkit.configuration.file.YamlConfiguration;
import org.bukkit.configuration.serialization.ConfigurationSerializable;
import org.eclipse.jdt.annotation.Nullable;

import ch.njol.yggdrasil.Fields;

/**
 * Uses strings for serialisation because the whole ConfigurationSerializable interface is badly documented, and especially DelegateDeserialization doesn't work well with
 * Yggdrasil.
 *
 * @author Peter Güttinger
 */
public class ConfigurationSerializer<T extends ConfigurationSerializable> extends Serializer<T> {

	@Override
	public Fields serialize(final T o) throws NotSerializableException {
		final Fields f = new Fields();
		f.putObject("value", serializeCS(o));
		return f;
	}

	@Override
	public boolean mustSyncDeserialization() {
		return false;
	}

	@Override
	public boolean canBeInstantiated() {
		return false;
	}

	@Override
	protected T deserialize(final Fields fields) throws StreamCorruptedException {
		final String val = fields.getObject("value", String.class);
		if (val == null)
			throw new StreamCorruptedException();
		final ClassInfo<? extends T> info = this.info;
		assert info != null;
		final T t = deserializeCS(val, info.getC());
		if (t == null)
			throw new StreamCorruptedException();
		return t;
	}

	public static String serializeCS(final ConfigurationSerializable o) {
		final YamlConfiguration y = new YamlConfiguration();
		y.set("value", o);
		return "" + y.saveToString();
	}

	@SuppressWarnings("unchecked")
	@Nullable
	public static <T extends ConfigurationSerializable> T deserializeCS(final String s, final Class<T> c) {
		final YamlConfiguration y = new YamlConfiguration();
		try {
			y.loadFromString(s);
		} catch (final InvalidConfigurationException e) {
			return null;
		}
		final Object o = y.get("value");
		if (!c.isInstance(o))
			return null;
		return (T) o;
	}

	@Override
	@Nullable
	public <E extends T> E newInstance(final Class<E> c) {
		assert false;
		return null;
	}

	@Override
	public void deserialize(final T o, final Fields fields) throws StreamCorruptedException {
		assert false;
	}

	@Override
	@Deprecated
	@Nullable
	public T deserialize(final String s) {
		final ClassInfo<? extends T> info = this.info;
		assert info != null;
		return deserializeCSOld(s, info.getC());
	}

	@SuppressWarnings("unchecked")
	@Deprecated
	@Nullable
	public static <T extends ConfigurationSerializable> T deserializeCSOld(final String s, final Class<T> c) {
		final YamlConfiguration y = new YamlConfiguration();
		try {
			y.loadFromString(s.replace("\uFEFF", "\n"));
		} catch (final InvalidConfigurationException e) {
			return null;
		}
		final Object o = y.get("value");
		if (!c.isInstance(o))
			return null;
		return (T) o;
	}

}
```
```java
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     path_to_url
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.apache.shardingsphere.sql.parser.statement.opengauss.ddl;

import org.apache.shardingsphere.sql.parser.statement.core.statement.ddl.FetchStatement;
import org.apache.shardingsphere.sql.parser.statement.opengauss.OpenGaussStatement;

/**
 * OpenGauss fetch statement.
 */
public final class OpenGaussFetchStatement extends FetchStatement implements OpenGaussStatement {
}
```
The Verification of the Origins of Rotation in Tornadoes Experiment (VORTEX) is a series of field experiments that study tornadoes. VORTEX1 was the first time scientists researched the entire evolution of a tornado with an array of instrumentation, enabling a greater understanding of the processes involved in tornadogenesis. A violent tornado near Union City, Oklahoma had been documented in its entirety by chasers of the Tornado Intercept Project (TIP) in 1973, and their visual observations advanced understanding of tornado structure and life cycles.
VORTEX2 used enhanced technology that allowed scientists to improve forecasting capabilities and improve lead time on advanced warnings to residents. VORTEX2 sought to reveal how tornadoes form, how long they last and why they last that long, and what causes them to dissipate.
VORTEX1 and VORTEX2 were based on large fleets of instrumented ground vehicles, supplemented by aircraft and mobile radars. Important work on developing and coordinating mobile mesonets came from these field projects. Analysis of data collected in subsequent years led to significant advances in the understanding of supercell and tornado morphology and dynamics. The field research phase of the VORTEX2 project concluded on July 6, 2010.
VORTEX1
The VORTEX1 project sought to understand how a tornado is produced by deploying tornado experts in around 18 vehicles that were equipped with customized instruments used to measure and analyze the weather around a tornado. As noted, aircraft and radar resources were also deployed for such measurements. The project directors were also interested in why some supercells, or mesocyclones within such storms, produce tornadoes while others do not, and why some supercells form violent tornadoes while others produce only weak ones.
The original project took place in 1994 and 1995. Several smaller studies, such as SUB-VORTEX and VORTEX-99, were conducted from 1996 to 2008. VORTEX1 documented the entire life cycle of a tornado, measuring the whole event with substantial instrumentation for the first time. Severe weather warnings improved in the years that followed, and many credit VORTEX1 with contributing to that improvement.
“An important finding from the original VORTEX experiment was that the factors responsible for causing tornadoes happen on smaller time and space scales than scientists had thought. New advances will allow for a more detailed sampling of a storm's wind, temperature, and moisture environment, and lead to a better understanding of why tornadoes form, and how they can be more accurately predicted,” said Stephan Nelson, NSF program director for physical and dynamic meteorology.
VORTEX had the capability to fly Doppler weather radar above the tornado approximately every five minutes.
VORTEX research helped the National Weather Service (NWS) to provide tornado warnings to residents with a lead time of 13 minutes. A federal research meteorologist, Don Burgess, estimates that the "false alarms" pertaining to severe weather by the National Weather Service have declined by 10 percent.
The movie Twister was at least partially inspired by the VORTEX project.
VORTEX2
VORTEX2 was an expanded second VORTEX project, with field phases from 10 May until 13 June 2009 and 1 May until 15 June 2010. VORTEX2's goals were studying why some thunderstorms produce tornadoes while others do not, and learning about tornado structure, in order to make more accurate tornado forecasts and warnings with longer lead time. VORTEX2 was by far the largest and most ambitious tornado study ever with over 100 scientific participants from many different universities and research laboratories.
"We still do not completely understand the processes that lead to tornado formation and shape its development. We hope that VORTEX2 will provide the data we need to learn more about the development of tornadoes and in time help forecasters give people more advance warning before a tornado strikes," said Roger Wakimoto, director of the Earth Observing Laboratory (EOL) at the National Center for Atmospheric Research (NCAR) and a principal investigator for VORTEX2.
"Then you can get first responders to be better prepared—police, fire, medical personnel, even power companies. Now, that's not even remotely possible," said Stephan P. Nelson, a program director in the atmospheric sciences division of the National Science Foundation (NSF).
Joshua Wurman, president of the Center for Severe Weather Research (CSWR) in Boulder, Colorado, proposed, "if we can increase that lead time from 13 minutes to half an hour, then the average person at home could do something different. Maybe they can seek a community shelter instead of just going into their bathtub. Maybe they can get their family to better safety if we can give them a longer warning and a more precise warning."
VORTEX2 deployed 50 vehicles customized with mobile radar, including the Doppler On Wheels (DOW) radars, SMART radars, the NOXP radar, a fleet of instrumented vehicles, unmanned aerial vehicles (UAVs), deployable instrument arrays called Sticknet and Podnet, and mobile weather balloon launching equipment. More than 100 scientists and crew researched tornadoes and supercell thunderstorms in the "Tornado Alley" region of the United States' Great Plains between Texas and Minnesota. A number of institutions and countries were involved in the US$11.9 million project, including: the US National Oceanic and Atmospheric Administration (NOAA) and its National Weather Service and the Storm Prediction Center (SPC) therein, the Australian Bureau of Meteorology (BOM), Finland, Italy, the Netherlands, the United Kingdom, Environment Canada, and universities across the United States and elsewhere.
The project included DOW3, DOW6, DOW7, Rapid-Scan DOW, SMART-RADARs, NOXP, UMASS-X, UMASS-W, CIRPAS and TIV 2 for their mobile radar contingent. The Doppler on Wheels radars were supplied by the Center for Severe Weather Research, and the SMART-Radars by the University of Oklahoma (OU). The National Severe Storms Laboratory (NSSL) supplied the NOXP radar, as well as several other radar units from the University of Massachusetts Amherst, the Office of Naval Research (ONR), and Texas Tech University (TTU). NSSL and CSWR supplied mobile mesonet fleets. Mobile radiosonde launching vehicles were provided by NSSL, NCAR, and the State University of New York at Oswego (SUNY Oswego). There was a range of other deployable state-of-the-art instrumentation, such as Sticknets from TTU, tornado PODS from CSWR, and four disdrometers from the University of Colorado (CU) and the University of Illinois at Urbana-Champaign (UIUC).
VORTEX2 technology allowed trucks with radar to be placed in and near tornadic storms and allowed continuous observations of the tornadic activity. Howard Bluestein, a meteorology professor at the University of Oklahoma, said, "We will be able to distinguish between rain, hail, dust, debris, flying cows."
Additionally, photogrammetry teams, damage survey teams, unmanned aircraft, and weather balloon launching vans helped to surround the tornadoes and thunderstorms. The equipment amassed enabled three-dimensional data sets of the storms to be collected with radars and other instruments every 75 seconds (more frequently for some individual instruments), and resolution of the tornado and tornadic storm cells as close as .
Scientists met May 10 and held a class to teach the crews how to launch the tornado pods, which would have to be released within 45 seconds of notification. VORTEX2 was equipped with 12 tornado PODS, instruments mounted onto towers that measure wind velocity (i.e. speed and direction). The aim was that some of the measurements would be taken in the center of the tornado. Once the pods are deployed, the teams repeat the process at the next location until finally the teams return to the south of the tornado to retrieve the pods with the recorded data. This takes place within , or 4 minutes away from the tornado itself.
The team had 24 highly portable Sticknets, which can be set up at various locations around tornadic storm cells to measure wind fields, provide atmospheric readings, and acoustically record hail and precipitation.
Scientists are still seeking to refine their understanding of which mesocyclone-forming supercell thunderstorms will eventually produce tornadoes, through which processes and storm-scale interactions, and within which atmospheric environments.
Updates on the progress of the project were posted on the VORTEX2 home page. The scientists also started a blog of live reports.
"Even though this field phase seems to be the most spectacular and seems like it's a lot of work, by far the majority of what we're doing is when we go back to our labs, when we work with each other, when we work with our students to try to figure out just what is it that we've collected," Wurman said. "It's going to take years to digest this data and to really get the benefit of this." Penn State University featured the public release of the initial scientific findings in the fall.
The forecasters were determining the best probability of sighting a tornado. As the trucks traveled from Childress, Texas to Clinton, Oklahoma, they encountered mammatus clouds and lightning at sundown on May 13, 2009.
The project encountered its first tornado on the afternoon of June 5 when they successfully intercepted a tornado in southern Goshen County, Wyoming, which lasted for approximately 25 minutes. One of their vehicles, Probe 1, suffered hail damage during the intercept. Later that evening, embedded Weather Channel (TWC) reporter Mike Bettes reported that elements of VORTEX2 had intercepted a second tornado in Nebraska. Placement of the armada for this tornado was nearly ideal. It was surrounded for its entire life cycle, making it the most thoroughly observed tornado in history.
Partial list of scientists and crew
The complete team comprises about 50 scientists and is supplemented by students. A complete listing of principal investigators (PIs) is at http://vortex2.org/. An alphabetical partial listing of VORTEX2 scientists and crew:
Nolan Atkins, Scientific PI, Professor Lyndon State College.
Michael Biggerstaff, Scientific PI, Professor, University of Oklahoma, expertise is in polarimetric radars, mobile radars, cloud physics and electrification, tropical cyclones (hurricanes), severe local storms, and storm dynamics. He is the Director of the SMART radar program at OU.
Howard Bluestein, Steering Committee, Scientific PI, Professor University of Oklahoma specializes in violent weather phenomena and provides expertise with Doppler weather radar. He is a professor in meteorology.
Donald W. Burgess, Steering Committee, Scientific PI, Scientist at CIMMS.
David Dowell, Steering Committee, Scientific PI, Scientist, National Center for Atmospheric Research.
Jeffrey Frame, Professor University of Illinois at Urbana-Champaign, expert in severe convection.
Katja Friedrich, Scientific PI, Associate Professor, University of Colorado.
Karen Kosiba, Scientific PI, is a senior research meteorologist at the Center for Severe Weather Research.
Timothy P. Marshall, P.E. is a damage analyst with a background in civil/structural engineering and meteorology.
Paul Markowski, Steering Committee, Scientific PI, associate professor in meteorology at Pennsylvania State University, specializes in severe storm dynamics.
Matthew Parker, Scientific PI, Mobile Soundings Coordinator, Associate Professor of meteorology at North Carolina State University. Specializes in the dynamics of convective storms, including tornadic supercells and mesoscale convective systems (MCSs).
Erik N. Rasmussen, Steering Committee, Scientific PI, VORTEX2 co-PI, Atmospheric Scientist and VORTEX1 field director, Rasmussen Systems.
Yvette Richardson, Steering Committee, Scientific PI, associate professor in meteorology at Pennsylvania State University, specializes in severe storm dynamics.
Glen Romine, Scientific PI, Project Scientist, National Center for Atmospheric Research.
Paul Robinson is a senior research meteorologist at the Center for Severe Weather Research.
Roger Wakimoto, Scientific PI, Director National Center for Atmospheric Research.
Chris Weiss, Scientific PI, Associate Professor, Texas Tech University.
Louis Wicker is a research scientist with a specialty in modeling of severe storm dynamics. He was also a co-team leader in VORTEX1. National Severe Storms Laboratory.
Joshua Wurman Steering Committee, Scientific PI, VORTEX2 PI, president at the Center for Severe Weather Research with a specialty in mobile Doppler weather radar, invented and leads the Doppler On Wheels (DOW) program.
Smaller projects
Other smaller field projects include the previously mentioned SUB-VORTEX (1997–98) and VORTEX-99 (1999), and VORTEX-Southeast (VORTEX-SE) (2016–2019).
See also
TOtable Tornado Observatory (TOTO)
TWISTEX
References
External links
Scientific Program and Experimental Design overviews
NSF press release for VORTEX2
Information on the VORTEX1 project
VORTEX-99
NSSL VORTEX2 profile
Earth Observing Lab project profile (NCAR)
VORTEX1 by David O. Blanchard
Tornado Alley, a documentary featuring VORTEX2 researchers
Severe weather and convection
Meteorology research and field projects
Tornado
Tornadogenesis
Storm chasing
```python
import numpy
import numpy.random
import copy
import itertools
import dill
import tqdm
def minibatch(train_X, train_y, size=16, nr_update=1000):
    """Yield (X, y) batches of `size` rows, reshuffling between passes,
    until exactly `nr_update` batches have been produced."""
    with tqdm.tqdm(total=nr_update * size, leave=False) as pbar:
        while nr_update > 0:
            indices = numpy.arange(len(train_X))
            numpy.random.shuffle(indices)
            j = 0
            while j < indices.shape[0] and nr_update > 0:
                slice_ = indices[j : j + size]
                X = _take_slice(train_X, slice_)
                y = _take_slice(train_y, slice_)
                yield X, y
                j += size
                nr_update -= 1
                pbar.update(size)


def _take_slice(data, slice_):
    # Lists/tuples need elementwise indexing; arrays support fancy indexing.
    if isinstance(data, (list, tuple)):
        return [data[int(i)] for i in slice_]
    return data[slice_]
class BestFirstFinder(object):
    """Best-first search over model candidates, keeping a bounded queue
    scored by development accuracy (discounted for overfitting)."""
    def __init__(self, **param_values):
        self.queue = []
        self.limit = 16
        self.params = param_values
        self.best_acc = 0.0
        self.best_i = 0
        self.i = 0
        self.j = 0
        self.best_model = None
        self.temperature = 0.0

    @property
    def configs(self):
        # Cartesian product of the hyper-parameter value lists.
        keys, value_groups = zip(*self.params.items())
        for values in itertools.product(*value_groups):
            yield dict(zip(keys, values))

    def enqueue(self, model, train_acc, check_acc):
        # Figure of merit: dev accuracy, discounted when train >> dev accuracy.
        fom = check_acc * min(check_acc / train_acc, 1.0)
        self.queue.append([fom, self.i, 0, model])
        if check_acc >= self.best_acc:
            self.best_acc = check_acc
            self.best_i = self.i
            self.best_model = model
            self.temperature = 0.0
        else:
            # No improvement: raise the temperature to widen resampling.
            self.temperature += 0.01
        self.i += 1
        self.j = 0
        self._prune()

    def _prune(self):
        # Sort on (score, insertion index) so models themselves are never compared.
        self.queue.sort(key=lambda item: (item[0], item[1]), reverse=True)
        self.queue = self.queue[:self.limit]

    def __iter__(self):
        self._prune()
        for i in range(len(self.queue)):
            # Decay the score so a candidate is not re-expanded forever.
            self.queue[i][0] -= 0.01
            # Record lineage in the hparams dict of the (model, sgd, hparams) tuple.
            self.queue[i][-1][2]['parent'] = self.queue[i][2]
            self.queue[i][2] += 1
            yield self.queue[i][-1]

    @property
    def best(self):
        return self.best_model
def resample_hyper_params(hparams, temperature):
    hparams = dict(hparams)
    hparams['epochs'] = hparams.get('epochs', 0) + 1
    hparams['learn_rate'] = resample(hparams['learn_rate'], 1e-6, 0.1, temperature)
    #hparams['beta1'] = resample(hparams.get('beta1', 0.9), 0.8, 1.0, temperature)
    #hparams['beta2'] = resample(hparams.get('beta2', 0.9), 0.8, 1.0, temperature)
    #hparams['L2'] = resample(hparams['L2'], 0.0, 1e-3, temperature)
    #hparams['batch_size'] = int(resample(hparams['batch_size'], 10, 256, temperature))
    #hparams['dropout'] = resample(hparams['dropout'], 0.05, 0.7, temperature)
    return hparams


def resample(curr, min_, max_, temperature):
    # Gaussian jitter scaled by the temperature, clipped to [min_, max_].
    if temperature == 0.0:
        return curr
    scale = (max_ - min_) * temperature
    next_ = numpy.random.normal(loc=curr, scale=scale)
    return min(max_, max(min_, next_))
def train_epoch(model, sgd, hparams, train_X, train_y, dev_X, dev_y, device_id=-1,
                temperature=0.0):
    # Deep-copy via serialisation so each candidate trains independently.
    model, sgd, hparams = dill.loads(dill.dumps((model, sgd, hparams)))
    if device_id >= 0:
        model.to_gpu(device_id)
        sgd.ops = model.ops
        sgd.to_gpu()
    if isinstance(train_y, numpy.ndarray):
        train_y = model.ops.asarray(train_y)
        dev_y = model.ops.asarray(dev_y)
    hparams = resample_hyper_params(hparams, temperature)
    sgd.learn_rate = hparams['learn_rate']
    sgd.beta1 = hparams['beta1']
    sgd.beta2 = hparams['beta2']
    sgd.L2 = hparams['L2']
    train_acc = 0.
    train_n = 0
    for X, y in minibatch(train_X, train_y, size=hparams['batch_size'],
                          nr_update=hparams['nr_update']):
        yh, finish_update = model.begin_update(X, drop=hparams['dropout'])
        if hasattr(y, 'shape'):
            # Single array of labels.
            dy = (yh - y) / y.shape[0]
            train_acc += (y.argmax(axis=1) == yh.argmax(axis=1)).sum()
            train_n += y.shape[0]
        else:
            # Ragged batch: a list of label arrays.
            n_y = sum(len(y_i) for y_i in y)
            dy = [(yh[i] - y[i]) / n_y for i in range(len(yh))]
            for i in range(len(y)):
                train_acc += (y[i].argmax(axis=1) == yh[i].argmax(axis=1)).sum()
            train_n += n_y
        finish_update(dy, sgd=sgd)
    train_acc /= train_n
    # Evaluate with the averaged parameters, as used at inference time.
    with model.use_params(sgd.averages):
        dev_acc = model.evaluate(dev_X, dev_y)
    model.to_cpu()
    sgd.to_cpu()
    return device_id, ((model, sgd, hparams), float(train_acc), float(dev_acc))
class DevicePool(object):
    """Synchronize GPU usage."""
    def __init__(self, n):
        self.devices = {i: None for i in range(n)}

    def acquire(self):
        # Return the index of a free device, or None if all are busy.
        for i, device in self.devices.items():
            if device is None:
                self.devices[i] = True
                return i
        return None

    def release(self, i):
        if i in self.devices:
            self.devices[i] = None
#
#def best_first_sgd(initials, train_X, train_y, dev_X, dev_y,
# get_new_model=None, get_score=None):
# if get_new_model is None:
# get_new_model = _get_new_model
# if get_score is None:
# get_score = _get_score
#
# queue = []
# for i, model in enumerate(initials):
# train_acc, model = get_new_model(model, train_X, train_y)
# check_acc = get_score(model, dev_X, dev_y)
# ratio = min(check_acc / train_acc, 1.0)
# print((model[-1], train_acc, check_acc))
# queue.append([check_acc * ratio, i, model])
#
# train_acc = 0
# limit = 8
# i = 0
# best_model = None
# best_acc = 0.0
# best_i = 0
# while best_i > (i - 100) and train_acc < 0.999:
# queue.sort(reverse=True)
# queue = queue[:limit]
# prev_score, parent, model = queue[0]
# queue[0][0] -= 0.001
# yield prev_score, parent, model
# train_acc, new_model = get_new_model(model, train_X, train_y)
# check_acc = get_score(new_model, dev_X, dev_y)
# ratio = min(check_acc / train_acc, 1.0)
#
# i += 1
# queue.append([check_acc * ratio, i, new_model])
#
# if check_acc >= best_acc:
# best_acc = check_acc
# best_i = i
# best_model = new_model
# progress = {
# 'i': i,
# 'parent': parent,
# 'prev_score': prev_score,
# 'this_score': queue[-1][0],
# 'train_acc': train_acc,
# 'check_acc': check_acc,
# 'best_acc': best_acc,
# 'hparams': new_model[-1]
# }
# yield best_model, progress
#
#
#
```
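As a minimal standalone check of the temperature-scaled perturbation step, the `resample` helper is duplicated below from the block above so the snippet runs on its own; the learning-rate bounds and the per-step temperature schedule are illustrative assumptions, not values mandated by the search code.

```python
import numpy

def resample(curr, min_, max_, temperature):
    # Gaussian jitter whose scale grows with the search temperature,
    # clipped back into [min_, max_]; temperature 0.0 returns curr unchanged.
    if temperature == 0.0:
        return curr
    scale = (max_ - min_) * temperature
    next_ = numpy.random.normal(loc=curr, scale=scale)
    return min(max_, max(min_, next_))

# Illustrative walk: temperature rises 0.01 per non-improving step,
# mirroring how BestFirstFinder widens the jitter when progress stalls.
numpy.random.seed(0)
lr = 0.001
for step in range(5):
    lr = resample(lr, 1e-6, 0.1, temperature=0.01 * (step + 1))
print(1e-6 <= lr <= 0.1)
```

Because every draw is clipped to the stated bounds, the walk can never leave the feasible range, however high the temperature climbs.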
Johann Georg Elser (4 January 1903 – 9 April 1945) was a German worker who planned and carried out an elaborate assassination attempt on Adolf Hitler and other high-ranking Nazi leaders on 8 November 1939 at the Bürgerbräukeller in Munich (known as the Bürgerbräukeller Bombing). Elser constructed and placed a bomb near the platform from which Hitler was to deliver a speech. It did not kill Hitler, who left earlier than expected, but it killed eight people and injured sixty-two others. Elser was held as a prisoner for more than five years until he was executed at Dachau concentration camp less than a month before the surrender of Nazi Germany.
Background
Family and early life
Georg Elser (the name normally used to refer to him) was born in Hermaringen, Württemberg, to Ludwig Elser and Maria Müller. His parents married one year after his birth, and Maria moved to Königsbronn to live with Ludwig on his smallholding. His father was a timber merchant, while his mother worked on the farm. Georg was often left to care for his five younger siblings: Friederike (born 1904), Maria (born 1906), Ludwig (born 1909), Anna (born 1910) and Leonhard (born 1913). He attended elementary school in Königsbronn from 1910 to 1917 and showed ability in drawing, penmanship and mathematics. His childhood was marred by his father's heavy drinking. Elser recalled in his interrogation by the Gestapo in 1939 how his father habitually came home late from work drunk.
Career and social life
In 1917, Elser worked for half a year assisting in his father's business. Seeking independence, he started an apprenticeship as a lathe operator at the smelter in Königsbronn, but had to quit for health reasons. Between 1919 and 1922, he was apprenticed to master woodworker Robert Sapper in Königsbronn. After topping his class at Heidenheim Trade School, he worked in the furniture factory of Paul Rieder in Aalen. In 1925, he left home to briefly work at the Wachter woodworking company in the small community of Bernried, near Tettnang. Exploring along Lake Constance on foot, he arrived at Friedrichshafen, where he found employment shaping wooden propellers for the fledgling aircraft-manufacturer Dornier.
In August 1925, a work-friend enticed Elser to go with him to Konstanz to work in a clock factory. Due to lack of work, the clock factory closed down, was sold, then reopened as the Schuckmann Clock Factory. Elser was re-employed, but, along with the other employees, he was dismissed when the factory mysteriously burned down after the owner had unsuccessfully tried to sell the failing business. During this period, Elser shared a room with a Communist co-worker who convinced him to join the Red Front-Fighters League. He also joined traditional dress and dance groups (). In 1929, he found work with Schönholzer, a small woodworking company in Bottighofen; this required Elser to cross the border daily into Switzerland. The work ran out within six months, however, and he was let go.
Around this time Elser met a waitress, Mathilde Niedermann. When she became pregnant, he drove her to Geneva in Switzerland, where Mathilde was found to be in the fourth month of pregnancy, precluding a legal abortion. The child was born, a boy named Manfred. When Elser left Mathilde, he was burdened with child-support payments that often surpassed his weekly wage.
In 1930, Elser began commuting daily by ferry from Konstanz to work in the small Rothmund clock factory in Meersburg where he made housings for wall and table clocks. At the Kreuzlingen Free Temperance Union he started a friendship with a seamstress, Hilda Lang. Between May and August 1932, after Rothmund closed down, he lived with several families in Meersburg doing odd carpentry jobs.
In August 1932, Elser returned to Königsbronn after receiving a call for help from his mother. His alcoholic father, often violent and abusive towards her, was now heavily in debt. Elser assisted his parents in their work and supplemented his income by making furniture in a home workshop until his father was forced to sell the family property in late 1935. Elser escaped the grim family situation with music, playing flute, accordion, bass and the zither. He joined the Zither Club in Königsbronn in early 1933.
At around this time, Elser joined a hiking club where he met Elsa Härlen. He moved to lodge in the Härlens' basement, building kitchen cabinets, kitchen chairs and a doll's house for Elsa. Their love affair in the spring of 1936 led to her separation from her husband in 1937 and divorce in 1938.
In 1936, Elser worked with a carpenter named Grupp in Königsbronn, making desks and installing windows, but soon gave up the job, believing the pay was too low. He began working as a labourer at the Waldenmaier armament factory in Heidenheim, commuting by train or by bike from Königsbronn. While working there, he began a friendship with a fellow employee, Maria Schmauder.
In 1938, Elser's parents bought half of a double house together with their son Leonhard and his wife. Elser felt cheated, and was forced to move out of the house, severing ties with his family except for his sister Maria in Stuttgart. In May 1939, he moved in with the Schmauder family in nearby Schnaitheim.
At Waldenmaier, Elser worked in the shipping department and had access to many parts of the plant, including the "special department" where fuses and detonators were produced. After his arrest and confession, Elser told the Gestapo: "Before the decision to take my action in the fall of 1938, I had stolen neither parts nor powder from the factory."
Ideology and religion
Elser, a carpenter and cabinet-maker by trade, became a member of the left-leaning Federation of Woodworkers Union. He also joined the Red Front-Fighters' Association, although he told his interrogators in 1939 that he attended a political assembly no more than three times while a member. He also stated that he voted for the Communist Party until 1933, as he considered the KPD the best defender of workers' interests. There is evidence that Elser opposed Nazism from the beginning of the regime in 1933; he refused to perform the Hitler salute, did not join others in listening to Hitler's speeches broadcast on the radio, and did not vote in the elections or referendums during the Nazi era.
Elser met Josef Schurr, a Communist from Schnaitheim, at a Woodworkers Union meeting in Königsbronn in 1933. Elser held extreme views, as evidenced by a letter Schurr sent to a newspaper in Ulm in 1947, which stated that Elser "was always extremely interested in some act of violence against Hitler and his cronies. He always called Hitler a 'gypsy'—one just had to look at his criminal face."
Elser's parents were Protestant, and he attended church with his mother as a child, though his attendance lapsed. His church attendance increased during 1939, after he had decided to carry out the assassination attempt, at either a Protestant or a Roman Catholic church. He claimed that church attendance and the recitation of the Lord's Prayer calmed him. He told his arresting officers: "I believe in the survival of the soul after death, and I also believed that I would not go to heaven if I had not had an opportunity to prove that I wanted good. I also wanted to prevent by my act even greater bloodshed."
Prelude
Motive
During four days of interrogation in Berlin (19–22 November 1939), Elser articulated his motive to his interrogators:
Five years later in Dachau concentration camp, SS officer Lechner claimed Elser revealed his motive to him:
Plot
In order to find out how best to implement his assassination plan, Elser travelled to Munich by train on 8 November 1938, the day of Hitler's annual speech on the anniversary of the Beer Hall Putsch. Elser was not able to enter the Bürgerbräukeller until 10:30 p.m., when the crowd had dispersed. He stayed until midnight before going back to his lodging. The next morning, he returned to Königsbronn. On the following day, 10 November, the anti-Jewish violence of the Kristallnacht took place in Munich.
"In the following weeks I slowly concocted in my mind that it was best to pack explosives in the pillar directly behind the speaker's podium," Elser told his interrogators a year later. He continued to work in the Waldenmaier armament factory in Heidenheim and systematically stole explosives, hiding packets of powder in his bedroom. Realising he needed the exact dimensions of the column to build his bomb, he returned to Munich, staying 4–12 April 1939. He took a camera with him, a Christmas gift from Maria Schmauder. He had just become unemployed due to an argument with a factory supervisor.
In April–May 1939, Elser found a labouring job at the Vollmer quarry in Königsbronn. While there, he collected an arsenal of 105 blasting cartridges and 125 detonators, causing him to admit to his interrogators, "I knew two or three detonators were sufficient for my purposes, but I thought the surplus will increase the explosive effect." Living with the Schmauder family in Schnaitheim he made many sketches, telling his hosts he was working on an "invention".
In July, in a secluded orchard owned by his parents, Elser tested several prototypes of his bomb. Clock movements given to him in lieu of wages when leaving Rothmund in Meersburg in 1932 and a car indicator "winker" were incorporated into the "infernal machine". In August, after a bout of sickness, he left for Munich. Powder, explosives, a battery and detonators filled the false bottom of his wooden suitcase. Other boxes contained his clothes, clock movements and the tools of his trade.
The Bürgerbräukeller
Elser arrived in Munich on 5 August 1939. Using his real name, he rented a room in the apartments of two unsuspecting couples, at first staying with the Baumanns and from 1 September, Alfons and Rosa Lehmann. He soon became a regular at the Bürgerbräukeller restaurant for his evening meal. As before, he was able to enter the adjoining Bürgerbräukeller Hall before the doors were locked at about 10:30 p.m.
Over the next two months, Elser stayed all night inside the Bürgerbräukeller 30 to 35 times. Working on the gallery level and using a flashlight dimmed with a blue handkerchief, he started by installing a secret door in the timber panelling to a pillar behind the speaker's rostrum. After removing the plaster behind the door, he hollowed out a chamber in the brickwork for his bomb. Normally completing his work around 2:00–3:00 a.m., he dozed in the storeroom off the gallery until the doors were unlocked at about 6:30 a.m. He then left via a rear door, often carrying a small suitcase filled with debris.
Security was relatively lax at the Bürgerbräukeller. Christian Weber, a veteran of the Beer Hall Putsch and a Munich city councillor, was responsible for it. However, from the beginning of September, after the outbreak of war with Poland, Elser was aware of the presence of air raid wardens and two "free-running dogs" in the building.
While he worked at night in the Bürgerbräukeller, Elser built his device during the day. He purchased extra parts, including sound insulation, from local hardware stores and became friends with the local master woodworker, Brög, who allowed him use of his workshop.
On the nights of 1–2 November, Elser installed the explosives in the pillar. On 4–5 November, which were Saturday and Sunday dance nights, he had to buy a ticket and wait in the gallery until after 1 a.m. before he could install the twin-clock mechanism that would trigger the detonator. To celebrate the completion of his work, Elser recalled later, "I left by the back road and went to the Isartorplatz where at the kiosk I drank two cups of coffee."
On 6 November, Elser left Munich for Stuttgart to stay overnight with his sister, Maria Hirth, and her husband. Leaving them his tool boxes and baggage, he returned to Munich the next day for a final check. Arriving at the Bürgerbräukeller at 10 p.m., he waited for an opportunity to open the bomb chamber and satisfy himself the clock mechanism was correctly set. The next morning he departed Munich by train for Friedrichshafen via Ulm. After a shave at a hairdresser, he took the 6:30 p.m. steamer to Konstanz.
Bombing
Hitler's escape
The high-ranking Nazis who accompanied Adolf Hitler to the anniversary of the Beer Hall Putsch on 8 November 1939 were Joseph Goebbels, Reinhard Heydrich, Rudolf Hess, Robert Ley, Alfred Rosenberg, Julius Streicher, August Frank, Hermann Esser and Heinrich Himmler. Hitler was welcomed to the platform by Christian Weber.
Unknown to Elser, Hitler had initially cancelled his speech at the Bürgerbräukeller to devote his attention to planning the imminent war with France, but changed his mind and attended after all. As fog was forecast, possibly preventing him from flying back to Berlin the next morning, Hitler decided to return to Berlin the same night by his private train. With the departure from Munich's main station set for 9:30 p.m., the start of the event was brought forward half an hour to 8 p.m., and Hitler cut his speech from the planned two hours to one hour.
Hitler ended his address to the 3000-strong audience of the party faithful at 9:07 p.m., 13 minutes before Elser's bomb exploded at 9:20 p.m. By that time, Hitler and his entourage had left the Bürgerbräukeller. The bomb brought down part of the ceiling and roof and caused the gallery and an external wall to collapse, leaving a mountain of rubble. About 120 people were still in the hall at the time. Seven were killed (the cashier Maria Henle, Franz Lutz, Wilhelm Kaiser, a radio announcer named Weber, Leonhard Reindl, Emil Kasberger, and Eugen Schachta). Another sixty-three were injured, sixteen seriously, with one dying later. All but one of the dead were members of the Nazi Party, and all but one of those had been longtime supporters of its ideology.
Hitler did not learn of the attempt on his life until later that night on a stop in Nuremberg. When told of the bombing by Goebbels, Hitler responded, "A man has to be lucky." A little later Hitler had a different spin, saying, "Now I am completely at peace! My leaving the Bürgerbräukeller earlier than usual is proof to me that Providence wants me to reach my goal."
Honoring the victims
In Munich on 9 November, the annual guard of honour for the sixteen "blood martyrs" of the NSDAP who died in the Beer Hall Putsch of 1923 was held at the Feldherrnhalle as usual. Two days later, at the same location, an official ceremony for the victims of the Bürgerbräukeller bombing took place. Hitler returned from Berlin to stand before seven flag-draped coffins as Rudolf Hess addressed the SA guard, the onlookers, and listeners to Grossdeutsche Rundfunk ("Greater German Radio"). In his half-hour oration, Hess was not short on hyperbole:
After "Der gute Kamerad" was played, Hitler placed a wreath of chrysanthemums on each coffin, then stepped back to lift his arm in the Nazi salute. The very slow playing of "Deutschland über alles" ended the solemn ceremony.
Arrest
At 8:45 p.m. on the night of 8 November, Elser was apprehended by two border guards near the Swiss border fence in Konstanz. When taken to the border control post and asked to empty his pockets, he was found to be carrying wire cutters, numerous notes and sketches pertaining to explosive devices, firing pins and a blank colour postcard of the interior of the Bürgerbräukeller. At 11 p.m., during Elser's interrogation by the Gestapo in Konstanz, news of the bombing in Munich arrived by teleprinter. The next day, Elser was transferred by car to Munich Gestapo Headquarters.
Investigation
While still returning to Berlin by train, Hitler ordered Heinrich Himmler to put Arthur Nebe, head of Kripo (Criminal Police), in charge of the investigation into the Munich bombing. Himmler did this, but also assigned total control of the investigation to the chief of the Gestapo, Heinrich Müller. Müller immediately ordered the arrest of all Bürgerbräukeller personnel, while Nebe ran the onsite investigation, sifting through the debris.
Nebe had early success, finding the remains of brass plates bearing the patent numbers of a clock maker in Schwenningen, Württemberg. Despite clear evidence that the parts were of German manufacture, Himmler released to the press that they pointed to "foreign origin".
Himmler offered a reward of 500,000 marks for information leading to the capture of the culprits, and the Gestapo was soon deluged with hundreds of suspects. When one suspect was reported to have detonator parts in his pockets, Otto Rappold of the counter-espionage arm of the Gestapo sped to Königsbronn and neighbouring towns. Every family member and possible acquaintance of Elser was rounded up for interrogation.
At the Schmauder residence in Schnaitheim, 16-year-old Maria Schmauder told of her family's recent boarder who was working on an "invention", had a false bottom in his suitcase, and worked at the Vollmer quarry.
Interrogation in Munich
On 9 November, as only one of many suspects being held at Munich Gestapo Headquarters, Elser did not attract much attention for a few days, but when face-to-face meetings took place with Bürgerbräukeller staff, waitress Maria Strobl identified Elser as the odd customer who never ordered more than one drink. Later, on the basis of his Swabian accent, Elser was identified by a storekeeper as the man to whom he had sold a "sound proofing insulation plate" to deaden the sound of ticking clocks.
Nebe called in Franz Josef Huber, head of the Gestapo in Vienna, to assist. Huber had the idea of asking Elser to bare his knees. When he did, they were found to be badly bruised, the apparent result of working at low level during his night work at the Bürgerbräukeller.
Dr Albrecht Böhme, head of the Munich Kripo, was witness to a severe and prolonged beating of Elser, in which he said Himmler participated. He later recalled: "But Elser, who was groaning and bleeding profusely from the mouth and nose, made no confession; he would probably not have been physically able to, even if he had wanted to." However, on 15 November, Elser made a full written confession, though the document did not survive the war.
Interrogation in Berlin
Elser was transferred to Berlin Gestapo Headquarters on Prinz Albrecht Strasse, possibly on 18 November. His parents, siblings and their spouses, together with his former girlfriend Elsa Härlen, were taken by train to Berlin to be held in Moabit prison and then in the grand Hotel Kaiserhof. His mother, sister Maria Hirth, brother-in-law Karl Hirth and Elsa Härlen were interrogated in the presence of Elser.
In 1950, Elsa Härlen recollected:
Härlen was left in no doubt that Elser was only repeating what his interrogators wanted him to say. Apart from Maria Hirth and her husband, who were considered accomplices and imprisoned for over one year, the family members and Härlen were allowed to return home. While in Berlin, Härlen received special attention, being interviewed by Heinrich Himmler, having an audience with Adolf Hitler, and being quizzed by Martin Bormann. However, she did not help their cause, which was to find some fragment of evidence that Elser had not acted alone.
While in Berlin, Elser made five full-size drawings of the design of his bomb in order to persuade his interrogators that he was the sole instigator of the assassination attempt. These drawings are referred to in the Gestapo interrogation report, but have not survived.
Interrogation report
Five days of torture, 19–23 November, produced the Gestapo Protokoll (interrogation report). The document was signed off by Kappler, Schmidt and Seibold for the "Kriminalkommissare". Buried in the German archives in Koblenz until 1964, this report is now considered the most important source of information on Elser. The report did not mention the interrogation of Elser's family members and Elsa Härlen in Berlin, since it contained only the answers Elser gave to his interrogators. On the vital question of whether he was the sole instigator, Elser had this to say:
When Himmler read the final report, he flew into a rage and scrawled in green ink on the red cover: "What idiot wrote this?"
Nazi propaganda
Discarding the interrogation report that found Elser solely responsible, Hitler proceeded to use the Bürgerbräukeller bombing for propaganda purposes. On 22 November, German newspapers were filled with the story that the assassin, Georg Elser, had been funded by the British Intelligence Service, while the organiser of the crime was Otto Strasser. Photos of two British SIS officers, Richard Henry Stevens and Sigismund Payne Best, captured in the Venlo Incident on 9 November 1939, shared the front page of Deutsche Allgemeine Zeitung with a photo of Georg Elser.
SS officer Walter Schellenberg later wrote in his memoirs (The Labyrinth):
The Swiss magazine Appenzeller Zeitung reported on 23 November 1939 that Otto Strasser had denied any knowledge of Elser, Best or Stevens in an interview in Paris. On 13 November, Swiss authorities had expelled Strasser from Switzerland, after he was found to have made disparaging remarks about Hitler in a foreign newspaper in October.
Torture, drugs and hypnosis
The basement cells of the Berlin Gestapo Headquarters were notorious for the inhumane treatment of prisoners. Elser, however, was reportedly kept on the top floor until January or February 1941.
Arthur Nebe told Hans Gisevius of Elser's frayed state during this period. Gisevius wrote later,
Walter Schellenberg wrote of a conversation with Heinrich Müller, who told him,
Three days later, Schellenberg heard from Müller that three doctors had worked on Elser for twenty-four hours, injecting him with "sizable quantities of Pervitin", but he continued to say the same thing. Four hypnotists were summoned. Only one could put Elser into a trance, but the prisoner stuck to the same story. The psychologist wrote in his report that Elser was a "fanatic" and had a pathological desire for recognition. He concluded by saying pointedly: Elser had the "drive to achieve fame by eliminating Hitler and simultaneously liberating Germany from the 'evil of Hitler.'"
Reconstruction of the bomb
While at Berlin Gestapo Headquarters, Müller put Elser into a workshop and ordered him to reconstruct the explosive device he used at the Bürgerbräukeller. When Reinhard Heydrich and Walter Schellenberg visited Elser in the workshop, Schellenberg noted, "He [Elser] responded to questioning only with reluctance but opened up when he was praised for his craftsmanship. Then he would comment on his reconstructed model in detail and with great enthusiasm."
Elser's reconstruction of his Bürgerbräukeller bomb was held in such high regard by the Gestapo, they adopted it into their field manuals for training purposes.
Aftermath
Consequences for associates
The day after the bombing at the Bürgerbräukeller, outraged SS guards at Buchenwald concentration camp took revenge. Twenty-one Jews were killed by firing squad, and all Jews in the camp suffered three days of food deprivation.
The Gestapo descended on the village of Königsbronn to interrogate the inhabitants, asking the same questions over and over for months on end. The village was stigmatized as a nest of criminals and became known as "Assassinville". Elsewhere, everyone who might have had contact with Elser was hunted down and interrogated by the Gestapo.
The quarry owner Georg Vollmer and his employees were severely beaten during Gestapo interrogations. Sentenced to 20 years in Welzheim concentration camp for negligence in dealing with explosive materials, Vollmer was released in 1941 after his wife petitioned Rudolf Hess through old connections. Consumed by fear that her husband would be taken away again, she died six months after his release. Before her death she had spread a rumour that a Zurich music dealer named Kuch, together with a group of three Communists, had put Elser up to the assassination attempt.
Waldenmaier, the owner of the Waldenmaier armaments factory in Heidenheim, was more fortunate than Vollmer. With the backing of the Abwehr in 1944, he received the War Service Cross First Class for significant contributions to the war effort. In 1940, a Gestapo man had told him: "In spite of repeated torture, Elser had stuck to his story that he had carried out the attack in order to save the working people and the entire world from war."
The Munich locksmith Max Niederholer, who had unwittingly supplied Elser with metal parts, was bound and beaten and detained for two weeks by the Gestapo. Being born in London did not help his case. Maria Schmauder's father was subjected to lengthy interrogation, particularly as Elser had admitted to listening to foreign radio stations in his house—even though that practice was not banned until 1 September 1939. Mathilde Niedermann was interrogated over several nights by the Gestapo in 1939. She maintained that Elser was "completely uninterested in politics", even though it was in Konstanz that he became friendly with Communists. Almost sixty years later, Mathilde and Elser's son, Manfred Buhl, spoke at the dedication of the Georg-Elser-Platz in Munich in 1997—the same year he died.
Elser's lover Elsa Härlen said Elser "led a double life and completely separated his political life from his private life". In an interview in 1959, she said she did not want any restitution from the government of the Federal Republic, as "it was those gypsies that were there before" — meaning the Nazis — that had brought her harm. Generally his family had difficulty coming to terms with his confession as the sole instigator. In 1950, his mother continued to lay the blame on others saying: "I don't think my son would come up with anything like that on his own".
Imprisonment
Elser never faced a trial for the bombing of the Bürgerbräukeller. After his year of torment at Berlin Gestapo Headquarters, he was kept in special custody in Sachsenhausen concentration camp from early 1941 to early 1945, held in isolation in a T-shaped building reserved for protected prisoners. He was accommodated in three joined cells, each 9.35 m2, which provided space for his two full-time guards and a workspace where he made furniture and other items, including several zithers.
Elser's apparent preferential treatment, which included extra rations and daily visits to the camp barber for a shave, aroused interest amongst other prisoners, including British SIS officer Payne Best. He wrote later that Elser was also allowed regular visits to the camp brothel. Martin Niemöller was also a special inmate in the Sachsenhausen "bunker" and believed the rumours that Elser was an SS man and an agent of Hitler and Himmler. Elser kept a photo of Elsa Härlen in his cell. In early 1945, Elser was transferred to the bunker at Dachau concentration camp.
Death
In April 1945, with German defeat imminent, the Nazis' intention of staging a show trial over the Bürgerbräukeller bombing had become futile. Hitler ordered the execution of special security prisoner "Eller" — the name used for Elser in Dachau — along with Wilhelm Canaris, Dietrich Bonhoeffer and others who had plotted against him. The order, dated 5 April 1945, from the Gestapo HQ in Berlin, was addressed to the Commandant of the Dachau concentration camp, SS-Obersturmbannführer Eduard Weiter.
The order came into the possession of Captain S. Payne Best in May 1945, and appeared in Best's book, The Venlo Incident. That part of the order relating to Elser reads:
The signature on the order was illegible, according to Best.
In his 1947 book, To The Bitter End, Hans Bernd Gisevius commented on the order:
On 9 April 1945, four weeks before the end of the war in Europe, Georg Elser was shot dead and his fully dressed body immediately burned in the crematorium of Dachau Concentration Camp. He was 42 years old.
During a 1954 German court proceeding in which SS-Unterscharführer Edgar Stiller was tried as an accessory to murder, SS-Oberscharführer Theodor Bongartz, the man in charge of the crematorium at Dachau, was determined to have been the murderer of Georg Elser. As the SS man in charge of the special prisoners at Dachau from 1943 to 1945, Stiller was accused of escorting Elser to the crematorium, where Bongartz allegedly shot him. Bongartz himself was never brought to account, as he had died of an illness in 1945.
A plaque dedicated to Elser's memory in Königsbronn says:
Conspiracy theories
Elser has been the subject of rumours and various conspiracy theories since the Bürgerbräukeller bombing. After the war, Protestant pastor and theologian Martin Niemöller, also in custody in the "bunker" at Sachsenhausen, gave credence to the rumour that Elser had been a member of the SS and that the whole assassination attempt had been staged by the Nazis to portray Hitler as being protected by Providence. Many others, such as quarry owner Georg Vollmer, who built on his late wife's rumor, weighed in with their own versions of the truth. In 1948, Allen Welsh Dulles, the future Director of Central Intelligence (de facto head of the U.S. Central Intelligence Agency), summed up a range of conspiracy theories when he wrote:
In 1969, historical research by Anton Hoch based on The Gestapo Protokoll (interrogation report) dated 19–23 November 1939, found that Elser had acted alone and there was no evidence to involve the Nazi regime or any outside group in the assassination attempt.
Memorials
In contrast to the conspirators of the 20 July 1944 assassination attempt on Hitler, Elser was barely acknowledged in the official commemorative culture of the Federal Republic of Germany until the 1990s. A breakthrough to a more positive view of Elser came with the publication of a biography by Hellmut G. Haasis in 1999, followed by an expanded and revised edition in 2009. Since 2001, the Georg Elser Prize has been awarded every two years for civil courage, and on the occasion of Elser's 100th birthday in January 2003, Deutsche Post issued a commemorative stamp.
There are at least 60 streets and places named after Elser in Germany and several monuments. Claus Christian Malzahn wrote in 2005: 'That he was for so long ignored by the historians of both East and West Germany, merely goes to show just how long it took Germany to become comfortable with honestly confronting its own history. Johann Georg Elser, though, defied ideological categorization—and for that reason, he is a true German hero.'
In 2008, a music venue called Georg Elser Hallen was demolished in Munich; as of 2014, however, five venues in Munich bore the name Georg Elser Hallen. In 2011, a steel sculpture of Georg Elser was unveiled in Berlin by German playwright Rolf Hochhuth. The memorial, which cost 200,000 euros, was built on Hochhuth's initiative after the city authorities dismissed the project as too expensive; in the end, the Berlin state senate financed the sculpture. In September 1979, the Bürgerbräukeller was demolished; on its site now stand the GEMA Building, the Gasteig Cultural Centre and the Munich City Hilton Hotel. A plaque in the pavement at the entrance to the GEMA Building marks the position of the pillar that concealed Elser's bomb. 8. November 1939 is the name of the Johann Georg Elser Memorial in Munich, commemorating the resistance fighters against the Nazis; the monument is located in the Maxvorstadt district.
The story of Elser is commemorated in the 1989 film Seven Minutes () directed by Klaus Maria Brandauer, and the 2015 film 13 Minutes (), directed by Oliver Hirschbiegel.
The Georg Elser Prize was established in 2001. It is awarded every two years to individuals who have demonstrated civil courage or civil disobedience against injustice committed by the state.
See also
List of assassination attempts on Adolf Hitler
References
Notes
Further reading
Evans, Richard J., The Third Reich at War. Penguin Press, 2008, pp. 109–111.
Moorhouse, Roger, Killing Hitler: The Third Reich and the Plots against the Führer. Jonathan Cape, 2006, pp. 36–58.
Steinbach, Peter and Tuchel, Johannes, Georg Elser. Der Hitler-Attentäter. Berlin: Be.bra-Wissenschafts-Verlag, 2008.
Tom Ferry, "GEORG ELSER: The Zither Player", 2016.
External links
Tom Ferry, Georg Elser Detailed documentation at georgelser.info
Georg Elser und das Attentat vom Bürgerbräukeller 1939 at shoa.de
Heilige – Selige – Ehrwürdige – Namen – Geschichten – Ökumenisches Heiligenlexikon at heiligenlexikon.de
"The Carpenter Elser Versus the Führer Hitler" Der Spiegel, 8 November 2005
Mike Dash, One Man Against Tyranny, 18 August 2011
1903 births
1945 deaths
People from Heidenheim (district)
People from the Kingdom of Württemberg
Rotfrontkämpferbund members
Executed failed assassins of Adolf Hitler
Executed communists in the German Resistance
German people who died in Dachau concentration camp
Resistance members who died in Nazi concentration camps
Protestants in the German Resistance
People executed by Nazi Germany by firearm
People from Baden-Württemberg executed in Nazi concentration camps
German carpenters |
Only the Brave is a 2017 American biographical drama film directed by Joseph Kosinski and written by Ken Nolan and Eric Warren Singer, based on the GQ article "No Exit" by Sean Flynn. The film tells the story of the Granite Mountain Hotshots, an elite crew of firefighters from Prescott, Arizona, who lost 19 of their 20 members while fighting the Yarnell Hill Fire in June 2013, and is dedicated to their memory. It features an ensemble cast, including Josh Brolin, James Badge Dale, Jeff Bridges, Miles Teller, Alex Russell, Taylor Kitsch, Ben Hardy, Thad Luckinbill, Geoff Stults, Scott Haze, Andie MacDowell, and Jennifer Connelly.
Principal photography began in New Mexico in June 2016. Only the Brave was released by Columbia Pictures in North America and by Summit Entertainment in other territories on October 20, 2017. The film was a box-office bomb, grossing just $26.3 million worldwide against a $38 million budget. However, it received positive reviews, with praise for the cast and the film's touching tribute to its subjects. The film is dedicated to the Granite Mountain Hotshots and their families.
Plot
Fire and Rescue Crew 7 of Prescott, Arizona, superintended by Eric Marsh, responds to a wildfire. Arriving on scene, Eric predicts that the fire will threaten a local neighborhood, but is stood down by the assigned hotshot superintendent. When his prediction comes true, Eric vents his frustration to fire chief and close friend Duane Steinbrink: whenever a wildfire threatens Prescott, his crew will not be allowed to fight it directly, as they lack hotshot certification. Eric asks Duane for help in obtaining an evaluation for his crew; Prescott Mayor Worthington gives Eric until the end of the fire season to receive and pass it.
Meanwhile, a 21-year-old outcast named Brendan McDonough learns that his ex-girlfriend Natalie is pregnant with his child, though she forbids him from seeing her. Brendan is later arrested for larceny, which prompts his mother to evict him from her house. Determined to provide for his newborn daughter, Brendan interviews with Eric, who gives him a chance. The crew is finally evaluated during a wildfire deployment. Eric briefs his crew on how they plan to stop the fire but is criticized by their evaluator, Hayes, whom Eric bluntly disregards. Despite Hayes' concerns, Eric's strategy halts the fire, and the crew is certified as the Granite Mountain Hotshots.
The crew's season becomes much more demanding as Granite Mountain is deployed across the country. Brendan is accepted by Natalie and allowed to see his daughter. After being bitten by a rattlesnake during a fire in Prescott, Brendan approaches Eric about transferring to structural firefighting. Eric snaps at Brendan, stating that nobody on the structure side would hire an ex-con like him. Eric later argues with his wife Amanda, who has long resented his obsession with his job and his placing her desire to start a family on indefinite hold. After a heart-to-heart with Duane, Eric returns to Amanda, announcing that he is ready to settle down.
The Granite Mountain Hotshots are called to the Yarnell Hill Fire; on the way, Eric tells his captain, Jesse Steed, that he will step down as superintendent after the season and offers Jesse the job if he wants it. Eric also apologizes to Brendan for snapping at him and promises to help him as much as he can to secure his transfer. The crew sets a controlled burn, but an airtanker mistakenly douses it. Sending Brendan out as lookout, the remaining crew moves to find another suitable location. The wind suddenly shifts, and the fire jumps a trigger point. Brendan is rescued by another hotshot crew whilst the others relocate to a safe zone, but the fire moves too fast and burns too intensely to tackle safely.
The wind picks up again and the fire jumps Granite Mountain's safe zone, cutting the crew off from escape. After quickly clearing a small site, Eric calls for an overhead airtanker to douse the blaze, but it misses the drop zone. The crew deploys under their fire shelters as the fire sweeps over them. Brendan listens in on the radio traffic and is devastated when a call finally comes in from the first responders who arrive at the shelters: all 19 of his crewmates have perished. Though the information is handled sensitively, rumors spread among the devastated families about what has happened. As the sole survivor, Brendan demands to meet with them at the gathering point at Prescott Middle School. Upon his arrival, the families' worst fears are confirmed, and Brendan suffers a psychological breakdown from survivor's guilt.
Three years later, Brendan takes his daughter to a juniper tree Granite Mountain had earlier saved, now adorned with the crew members' memorabilia and photographs. The closing credits note that the film is dedicated to the lives lost in the Yarnell Hill Fire, the largest loss of firefighters since the September 11 attacks, and show photos of the actual Granite Mountain crew members.
Cast
Production
On March 1, 2016, Josh Brolin and Miles Teller joined the cast of the film. Jeff Bridges and Taylor Kitsch later also joined the cast. The film was produced under the working title Granite Mountain. Principal photography on the film began in New Mexico on June 13, 2016. Filming took place at different locations in and around Santa Fe and Los Alamos.
Joseph Trapanese composed the film's score. Dierks Bentley released a single called "Hold The Light", featuring S. Carey; the single and its music video were released on October 6, 2017.
Release
Only the Brave, originally titled Granite Mountain, was released on October 20, 2017, by Sony Pictures Releasing under its Columbia Pictures label. The film had previously been set for release on September 22, 2017, but a disagreement between Lionsgate and production company Black Label Media saw the U.S. distribution rights move to Columbia Pictures, with Summit Entertainment retaining international rights in select countries. The first trailer was released on July 19, 2017, when the film was retitled Only the Brave. The film was released digitally on January 23, 2018, and on DVD and Blu-ray on February 6, 2018. As of December 2018, it had made $7.2 million in home video sales.
Reception
Box office
Only the Brave grossed $18.3 million in the United States and Canada, and $7.4 million in other territories, for a worldwide total of $25.8 million, against a production budget of $38 million.
In the United States and Canada, Only the Brave was released alongside Boo 2! A Madea Halloween, The Snowman and Geostorm, and was expected to gross around $7 million from 2,575 theaters in its opening weekend. It made $305,000 from Thursday night previews and $2.1 million on its first day. It ended up debuting to $6 million, finishing 5th at the box office. In its second week the film dropped 42.5% to $3.4 million, finishing 7th.
Critical response
On review aggregator Rotten Tomatoes, the film has an approval rating of 87% based on 160 reviews, with an average rating of 7.09/10. The website's critical consensus reads, "Only the Brave's impressive veteran cast and affecting fact-based story add up to a no-frills drama that's just as stolidly powerful as the real-life heroes it honors." On Metacritic, which assigns a weighted average rating to reviews, the film has a weighted average score of 72 out of 100, based on 35 critics, indicating "generally favorable reviews". Audiences polled by CinemaScore gave the film an average grade of "A" on an A+ to F scale.
Bilge Ebiri of Village Voice wrote, "Only the Brave is a visually splendid, spellbinding, and surreal movie that also happens to be an emotionally shattering, over-the-top ugly-cry for the ages." Todd McCarthy of The Hollywood Reporter called the film "an engaging account of a tragic real-life story."
Richard Roeper of the Chicago Sun-Times gave the film 3.5 out of 4 stars, saying: "The blending of practical effects and CGI is impressive, and we come to understand the risks these men are taking, but some of the techniques and approaches they take remain a mystery, up to and through the climactic fire. Not that we need a manual to understand these men were working-class, everyday heroes." Scott Menzel of We Live Entertainment also praised the film, saying, "Only the Brave is without question the best firefighter film since Backdraft and one that pays tribute to the brave men that sacrificed their own lives to protect thousands of others."
Richard Brody noted in The New Yorker that "Only the Brave ties the characters’ private lives to their work lives in a plethora of details, but it never looks beyond the work life into life at large, or even into the life that surrounds them in their own home town." Brody described the film as a missed opportunity to depict those who battled local politicians to secure benefits for survivors of the Yarnell Hill Fire and the widows of the deceased Hotshots. The review quoted Fernanda Santos in The New York Times who wrote that "Juliann Ashcraft decided to leave Prescott altogether to spare her four children the discomfort of whispers and glares" — a reference to the harassment of women who challenged the decision to treat victims differently based on their employment status.
Accolades
References
External links
2017 films
2017 drama films
2017 biographical drama films
American biographical drama films
Black Label Media films
Columbia Pictures films
Di Bonaventura Pictures films
Drama films based on actual events
Films about firefighting
Films about wildfires
Films based on newspaper and magazine articles
Films directed by Joseph Kosinski
Films produced by Lorenzo di Bonaventura
Films scored by Joseph Trapanese
Films set in 2013
Films set in Prescott, Arizona
Films shot in New Mexico
Films with screenplays by Ken Nolan
Lionsgate films
Summit Entertainment films
2010s English-language films
2010s American films |
```ts
import { svgDefaultProps } from '@nivo/bar'
import {
themeProperty,
motionProperties,
defsProperties,
getLegendsProps,
groupProperties,
} from '../../../lib/componentProperties'
import {
chartDimensions,
ordinalColors,
chartGrid,
axes,
isInteractive,
commonAccessibilityProps,
} from '../../../lib/chart-properties'
import { ChartProperty, Flavor } from '../../../types'
const allFlavors: Flavor[] = ['svg', 'canvas', 'api']
const props: ChartProperty[] = [
{
key: 'data',
group: 'Base',
help: 'Chart data.',
type: 'object[]',
required: true,
flavors: allFlavors,
},
{
key: 'indexBy',
group: 'Base',
help: 'Key to use to index the data.',
description: `
Key to use to index the data,
this key must exist in each data item.
You can also provide a function which will
receive the data item and must return the desired index.
`,
type: 'string | (datum: RawDatum): string | number',
flavors: allFlavors,
required: false,
defaultValue: svgDefaultProps.indexBy,
},
{
key: 'keys',
group: 'Base',
help: 'Keys to use to determine each series.',
type: 'string[]',
flavors: allFlavors,
required: false,
defaultValue: svgDefaultProps.keys,
},
{
key: 'groupMode',
group: 'Base',
help: `How to group bars.`,
type: `'grouped' | 'stacked'`,
flavors: allFlavors,
required: false,
defaultValue: svgDefaultProps.groupMode,
control: {
type: 'radio',
choices: [
{ label: 'stacked', value: 'stacked' },
{ label: 'grouped', value: 'grouped' },
],
},
},
{
key: 'layout',
group: 'Base',
help: `How to display bars.`,
type: `'horizontal' | 'vertical'`,
flavors: allFlavors,
required: false,
defaultValue: svgDefaultProps.layout,
control: {
type: 'radio',
choices: [
{ label: 'horizontal', value: 'horizontal' },
{ label: 'vertical', value: 'vertical' },
],
},
},
{
key: 'valueScale',
group: 'Base',
type: 'object',
help: `value scale configuration.`,
defaultValue: svgDefaultProps.valueScale,
flavors: allFlavors,
required: false,
control: {
type: 'object',
props: [
{
key: 'type',
help: `Scale type.`,
type: 'string',
required: true,
flavors: allFlavors,
control: {
type: 'choices',
disabled: true,
choices: ['linear', 'symlog'].map(v => ({
label: v,
value: v,
})),
},
},
],
},
},
{
key: 'indexScale',
group: 'Base',
type: 'object',
help: `index scale configuration.`,
defaultValue: svgDefaultProps.indexScale,
flavors: allFlavors,
required: false,
control: {
type: 'object',
props: [
{
key: 'type',
help: `Scale type.`,
type: 'string',
required: true,
flavors: ['svg', 'canvas', 'api'],
control: {
type: 'choices',
disabled: true,
choices: ['band'].map(v => ({
label: v,
value: v,
})),
},
},
{
key: 'round',
required: true,
flavors: ['svg', 'canvas', 'api'],
help: 'Toggle index scale (for bar width) rounding.',
type: 'boolean',
control: { type: 'switch' },
},
],
},
},
{
key: 'reverse',
group: 'Base',
help: 'Reverse bars, starts on top instead of bottom for vertical layout and right instead of left for horizontal one.',
type: 'boolean',
required: false,
flavors: allFlavors,
defaultValue: svgDefaultProps.reverse,
control: { type: 'switch' },
},
{
key: 'minValue',
group: 'Base',
help: 'Minimum value.',
description: `
Minimum value, if 'auto',
will use min value from the provided data.
`,
required: false,
flavors: allFlavors,
defaultValue: svgDefaultProps.minValue,
type: `number | 'auto'`,
control: {
type: 'switchableRange',
disabledValue: 'auto',
defaultValue: -1000,
min: -1000,
max: 0,
},
},
{
key: 'maxValue',
group: 'Base',
help: 'Maximum value.',
description: `
Maximum value, if 'auto',
will use max value from the provided data.
`,
required: false,
flavors: allFlavors,
defaultValue: svgDefaultProps.maxValue,
type: `number | 'auto'`,
control: {
type: 'switchableRange',
disabledValue: 'auto',
defaultValue: 1000,
min: 0,
max: 1000,
},
},
{
key: 'valueFormat',
group: 'Base',
help: 'Optional formatter for values.',
description: `
The formatted value can then be used for labels & tooltips.
Under the hood, nivo uses [d3-format](path_to_url),
please have a look at it for available formats; you can also pass a function
which will receive the raw value and should return the formatted one.
`,
required: false,
flavors: allFlavors,
type: 'string | (value: number) => string | number',
control: { type: 'valueFormat' },
},
{
key: 'padding',
help: 'Padding between each bar (ratio).',
type: 'number',
required: false,
flavors: allFlavors,
defaultValue: svgDefaultProps.padding,
group: 'Base',
control: {
type: 'range',
min: 0,
max: 0.9,
step: 0.05,
},
},
{
key: 'innerPadding',
help: 'Padding between grouped/stacked bars.',
type: 'number',
required: false,
flavors: allFlavors,
defaultValue: svgDefaultProps.innerPadding,
group: 'Base',
control: {
type: 'range',
unit: 'px',
min: 0,
max: 10,
},
},
...chartDimensions(allFlavors),
themeProperty(allFlavors),
ordinalColors({
flavors: allFlavors,
defaultValue: svgDefaultProps.colors,
}),
{
key: 'colorBy',
type: `'id' | 'indexValue'`,
help: 'Property used to determine node color.',
description: `
Property to use to determine node color.
`,
flavors: allFlavors,
required: false,
defaultValue: svgDefaultProps.colorBy,
group: 'Style',
control: {
type: 'choices',
choices: [
{
label: 'id',
value: 'id',
},
{
label: 'indexValue',
value: 'indexValue',
},
],
},
},
{
key: 'borderRadius',
help: 'Rectangle border radius.',
type: 'number',
flavors: ['svg', 'canvas', 'api'],
required: false,
defaultValue: svgDefaultProps.borderRadius,
group: 'Style',
control: {
type: 'range',
unit: 'px',
min: 0,
max: 36,
},
},
{
key: 'borderWidth',
help: 'Width of bar border.',
type: 'number',
flavors: ['svg', 'canvas', 'api'],
required: false,
defaultValue: svgDefaultProps.borderWidth,
group: 'Style',
control: { type: 'lineWidth' },
},
{
key: 'borderColor',
help: 'Method to compute border color.',
description: `
how to compute border color,
[see dedicated documentation](self:/guides/colors).
`,
type: 'string | object | Function',
flavors: ['svg', 'canvas', 'api'],
required: false,
defaultValue: svgDefaultProps.borderColor,
group: 'Style',
control: { type: 'inheritedColor' },
},
...defsProperties('Style', ['svg']),
{
key: 'layers',
flavors: ['svg', 'canvas'],
help: 'Defines the order of layers.',
description: `
Defines the order of layers, available layers are:
\`grid\`, \`axes\`, \`bars\`, \`markers\`, \`legends\`,
\`annotations\`. The \`markers\` layer is not available
in the canvas flavor.
You can also use this to insert extra layers to the chart,
this extra layer must be a function which will receive
the chart computed data and must return a valid SVG
element.
`,
type: 'Array<string | Function>',
required: false,
defaultValue: svgDefaultProps.layers,
group: 'Customization',
},
{
key: 'enableLabel',
help: 'Enable/disable labels.',
type: 'boolean',
flavors: ['svg', 'canvas', 'api'],
required: false,
defaultValue: svgDefaultProps.enableLabel,
group: 'Labels',
control: { type: 'switch' },
},
{
key: 'label',
group: 'Labels',
help: 'Define how bar labels are computed.',
description: `
Define how bar labels are computed.
By default it will use the bar's value.
It accepts a string which will be used to access
a specific bar data property, such as
\`'value'\` or \`'id'\`.
You can also use a function if you want to
add more logic, this function will receive
the current bar's data and must return
the computed label which, depending on the context,
should return a string or an svg element (Bar) or
a string (BarCanvas). For example let's say you want
to use a label with both the id and the value,
you can achieve this with:
\`\`\`
label={d => \`\${d.id}: \${d.value}\`}
\`\`\`
`,
type: 'string | Function',
flavors: ['svg', 'canvas', 'api'],
required: false,
defaultValue: svgDefaultProps.label,
},
{
key: 'labelSkipWidth',
help: 'Skip label if bar width is lower than provided value, ignored if 0.',
type: 'number',
flavors: ['svg', 'canvas', 'api'],
required: false,
defaultValue: svgDefaultProps.labelSkipWidth,
group: 'Labels',
control: {
type: 'range',
unit: 'px',
min: 0,
max: 36,
},
},
{
key: 'labelSkipHeight',
help: 'Skip label if bar height is lower than provided value, ignored if 0.',
type: 'number',
flavors: ['svg', 'canvas', 'api'],
required: false,
defaultValue: svgDefaultProps.labelSkipHeight,
group: 'Labels',
control: {
type: 'range',
unit: 'px',
min: 0,
max: 36,
},
},
{
key: 'labelTextColor',
help: 'Defines how to compute label text color.',
type: 'string | object | Function',
flavors: ['svg', 'canvas', 'api'],
required: false,
defaultValue: svgDefaultProps.labelTextColor,
control: { type: 'inheritedColor' },
group: 'Labels',
},
{
key: 'labelPosition',
help: 'Defines the position of the label relative to its bar.',
type: `'start' | 'middle' | 'end'`,
flavors: allFlavors,
required: false,
defaultValue: svgDefaultProps.labelPosition,
control: {
type: 'radio',
choices: [
{ label: 'start', value: 'start' },
{ label: 'middle', value: 'middle' },
{ label: 'end', value: 'end' },
],
columns: 3,
},
group: 'Labels',
},
{
key: 'labelOffset',
help: 'Defines the vertical or horizontal (depends on layout) offset of the label.',
type: 'number',
flavors: ['svg', 'canvas', 'api'],
required: false,
defaultValue: svgDefaultProps.labelOffset,
control: {
type: 'range',
unit: 'px',
min: -16,
max: 16,
},
group: 'Labels',
},
{
key: 'enableTotals',
help: 'Enable/disable totals labels.',
type: 'boolean',
flavors: ['svg', 'canvas', 'api'],
required: false,
defaultValue: svgDefaultProps.enableTotals,
group: 'Labels',
control: { type: 'switch' },
},
{
key: 'totalsOffset',
help: 'Offset from the bar edge for the total label.',
type: 'number',
flavors: ['svg', 'canvas', 'api'],
required: false,
defaultValue: svgDefaultProps.totalsOffset,
group: 'Labels',
control: {
type: 'range',
unit: 'px',
min: 0,
max: 40,
},
},
...chartGrid({
flavors: allFlavors,
xDefault: svgDefaultProps.enableGridX,
yDefault: svgDefaultProps.enableGridY,
values: true,
}),
...axes({ flavors: allFlavors }),
isInteractive({
flavors: ['svg', 'canvas'],
defaultValue: svgDefaultProps.isInteractive,
}),
{
key: 'tooltip',
flavors: ['svg', 'canvas'],
group: 'Interactivity',
type: 'Function',
required: false,
help: 'Tooltip custom component',
description: `
A function allowing complete tooltip customisation,
it must return a valid HTML element and will receive
the following data:
\`\`\`
{
bar: {
id: string | number,
value: number,
formattedValue: string,
index: number,
indexValue: string | number,
// datum associated to the current index (raw data)
data: object
},
color: string,
label: string
}
\`\`\`
You can also customize the style of the tooltip
using the \`theme.tooltip\` object.
`,
},
{
key: 'custom tooltip example',
flavors: ['svg', 'canvas'],
group: 'Interactivity',
help: 'Showcase custom tooltip component.',
type: 'boolean',
required: false,
control: { type: 'switch' },
},
{
key: 'onClick',
flavors: ['svg', 'canvas'],
group: 'Interactivity',
type: 'Function',
required: false,
help: 'onClick handler',
description: `
onClick handler, will receive node data as first argument
& event as second one. The node data has the following shape:
\`\`\`
{
id: string | number,
value: number,
formattedValue: string,
index: number,
indexValue: string | number,
color: string,
// datum associated to the current index (raw data)
data: object
}
\`\`\`
`,
},
{
key: 'legends',
flavors: ['svg', 'canvas'],
type: 'object[]',
help: `Optional chart's legends.`,
group: 'Legends',
required: false,
control: {
type: 'array',
props: getLegendsProps(['svg']),
shouldCreate: true,
addLabel: 'add legend',
shouldRemove: true,
getItemTitle: (index: number, legend: any) =>
`legend[${index}]: ${legend.anchor}, ${legend.direction}`,
defaults: {
dataFrom: 'keys',
anchor: 'top-right',
direction: 'column',
justify: false,
translateX: 120,
translateY: 0,
itemWidth: 100,
itemHeight: 20,
itemsSpacing: 2,
symbolSize: 20,
itemDirection: 'left-to-right',
onClick: (data: any) => {
console.log(JSON.stringify(data, null, ' '))
},
},
},
},
...motionProperties(['svg'], svgDefaultProps),
{
key: 'isFocusable',
flavors: ['svg'],
required: false,
group: 'Accessibility',
help: 'Make the root SVG element and each bar item focusable, for keyboard navigation.',
description: `
If enabled, focusing will also reveal the tooltip if \`isInteractive\` is \`true\`,
when a bar gains focus and hide it on blur.
Also note that if this option is enabled, focusing a bar will reposition the tooltip
at a fixed location.
`,
type: 'boolean',
control: { type: 'switch' },
},
...commonAccessibilityProps(['svg']),
{
key: 'barAriaLabel',
flavors: ['svg'],
required: false,
group: 'Accessibility',
help: '[aria-label](path_to_url#aria-label) for bar items.',
type: '(data) => string',
},
{
key: 'barAriaLabelledBy',
flavors: ['svg'],
required: false,
group: 'Accessibility',
help: '[aria-labelledby](path_to_url#aria-labelledby) for bar items.',
type: '(data) => string',
},
{
key: 'barAriaDescribedBy',
flavors: ['svg'],
required: false,
group: 'Accessibility',
help: '[aria-describedby](path_to_url#aria-describedby) for bar items.',
type: '(data) => string',
},
{
key: 'barAriaHidden',
flavors: ['svg'],
required: false,
group: 'Accessibility',
help: '[aria-hidden](path_to_url#aria-hidden) for bar items.',
type: '(data) => boolean',
},
{
key: 'barAriaDisabled',
flavors: ['svg'],
required: false,
group: 'Accessibility',
help: '[aria-disabled](path_to_url#aria-disabled) for bar items.',
type: '(data) => boolean',
},
]
export const groups = groupProperties(props)
``` |
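For context, the `groupProperties` call at the end of the file organizes the flat `props` array into named groups for the docs UI. A minimal sketch of what that helper might look like follows; the real implementation lives in `../../../lib/componentProperties`, so the shape below is only an assumption inferred from the `group` field used throughout:

```typescript
// Hypothetical sketch: group ChartProperty-like entries by their `group`
// field, preserving the order in which groups first appear in the array.
interface ChartPropertyLike {
    key: string
    group?: string
}

interface PropertyGroup {
    name: string
    properties: ChartPropertyLike[]
}

export const groupProperties = (props: ChartPropertyLike[]): PropertyGroup[] => {
    const groups: PropertyGroup[] = []
    const byName = new Map<string, PropertyGroup>()
    for (const prop of props) {
        const name = prop.group ?? 'Misc'
        let entry = byName.get(name)
        if (entry === undefined) {
            entry = { name, properties: [] }
            byName.set(name, entry)
            groups.push(entry)
        }
        entry.properties.push(prop)
    }
    return groups
}

// Example: two Base props followed by one Style prop
const grouped = groupProperties([
    { key: 'data', group: 'Base' },
    { key: 'indexBy', group: 'Base' },
    { key: 'colorBy', group: 'Style' },
])
console.log(grouped.map(g => `${g.name}:${g.properties.length}`).join(',')) // prints "Base:2,Style:1"
```

Grouping by first appearance keeps the documented order (Base, Style, Labels, and so on) stable without needing a separate ordering table.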
```ts
/*
*
* See the LICENSE file at the top-level directory of this distribution
* for licensing information.
*
* Unless otherwise agreed in a custom licensing agreement with the Lisk Foundation,
* no part of this software, including this file, may be copied, modified,
* propagated, or distributed except according to the terms contained in the
* LICENSE file.
*
* Removal or modification of this copyright notice is prohibited.
*/
import { BaseEvent, EventQueuer } from '../../base_event';
import { TerminatedStateAccount, terminatedStateSchema } from '../stores/terminated_state';
export class TerminatedStateCreatedEvent extends BaseEvent<TerminatedStateAccount> {
public schema = terminatedStateSchema;
public log(ctx: EventQueuer, chainID: Buffer, data: TerminatedStateAccount): void {
this.add(ctx, data, [chainID]);
}
}
``` |
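The class above follows the Lisk SDK convention of a typed event whose `log` method queues a payload under one or more topics. A self-contained sketch of that pattern follows, with simplified stand-ins for `BaseEvent` and `EventQueuer`; the real SDK classes use `Buffer` topics and schema-based encoding, so everything here is an illustrative assumption:

```typescript
// Simplified sketch of the Lisk event/queue pattern; not the real SDK classes.
interface QueuedEvent {
    name: string
    data: unknown
    topics: string[]
}

// Stand-in for the SDK's EventQueuer: collects events emitted during a block.
class EventQueuer {
    public readonly queue: QueuedEvent[] = []
}

// Stand-in for BaseEvent<T>: subclasses call `add` with typed data and topics.
abstract class BaseEvent<T> {
    public constructor(private readonly eventName: string) {}

    protected add(ctx: EventQueuer, data: T, topics: string[]): void {
        ctx.queue.push({ name: this.eventName, data, topics })
    }
}

interface TerminatedStateAccount {
    stateRoot: string
    initialized: boolean
}

// Mirrors the event above: logs the account data keyed by the chain ID topic.
class TerminatedStateCreatedEvent extends BaseEvent<TerminatedStateAccount> {
    public log(ctx: EventQueuer, chainID: string, data: TerminatedStateAccount): void {
        this.add(ctx, data, [chainID])
    }
}

// Usage: queue one event for a hypothetical chain ID
const ctx = new EventQueuer()
new TerminatedStateCreatedEvent('terminatedStateCreated').log(ctx, '04000001', {
    stateRoot: '0xabc',
    initialized: true,
})
console.log(ctx.queue.length) // prints 1
```

In the real module, `chainID` is a `Buffer` and the payload is validated against `terminatedStateSchema` before encoding; strings are used here only to keep the sketch runnable.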
```yaml
category: Database
sectionOrder:
- Connect
- Collect
commonfields:
id: Elasticsearch v2
version: -1
configuration:
- display: Server URL
name: url
required: true
type: 0
additionalinfo: The Elasticsearch server to which the integration connects. Ensure that the URL includes the correct Elasticsearch port. By default this is 9200.
section: Connect
- additionalinfo: Provide Username + Password instead of API key + API ID
display: Username for server login
name: credentials
type: 9
section: Connect
required: false
- display: Trust any certificate (not secure)
name: insecure
type: 8
section: Connect
advanced: true
required: false
- display: Use system proxy settings
name: proxy
type: 8
section: Connect
advanced: true
required: false
- defaultvalue: Elasticsearch
additionalinfo: In some hosted Elasticsearch environments, the standard Elasticsearch client is not supported. If you encounter any related client issues, please consider using the OpenSearch client type.
display: Client type
name: client_type
options:
- Elasticsearch
- OpenSearch
type: 15
section: Connect
advanced: true
required: false
- display: Index from which to fetch incidents (CSV)
name: fetch_index
type: 0
section: Collect
required: false
- display: Query String
name: fetch_query
type: 0
additionalinfo: The query will be used when fetching incidents. Index time field will be used as a filter in the query.
section: Collect
advanced: true
required: false
- display: Index time field (for sorting and limiting data)
name: fetch_time_field
type: 0
section: Collect
advanced: true
required: false
additionalinfo: The time field on which sorting and limiting are performed. If using a nested field, separate field names using dot notation.
- display: Raw Query
name: raw_query
type: 12
additionalinfo: Will override the 'Query String' Lucene syntax string. Results will not be filtered.
section: Collect
advanced: true
required: false
- display: Time field type
defaultvalue: 'Simple-Date'
name: time_method
type: 15
options:
- Simple-Date
- Timestamp-Seconds
- Timestamp-Milliseconds
section: Collect
advanced: true
required: false
- defaultvalue: 'true'
display: Map JSON fields into labels
name: map_labels
type: 8
section: Collect
advanced: true
required: false
- defaultvalue: '3 days'
display: First fetch timestamp (<number> <time unit>, e.g., 12 hours, 7 days)
name: fetch_time
type: 0
section: Collect
required: false
- defaultvalue: '50'
display: The maximum number of results to return per fetch.
name: fetch_size
type: 0
section: Collect
required: false
- display: Request timeout (in seconds).
name: timeout
type: 0
defaultvalue: '60'
section: Connect
advanced: true
required: false
- display: Incident type
name: incidentType
type: 13
section: Connect
required: false
- display: Fetch incidents
name: isFetch
type: 8
section: Collect
required: false
description: "Search for and analyze data in real time. \n Supports version 6 and later."
display: Elasticsearch v2
name: Elasticsearch v2
script:
commands:
- arguments:
- description: The index in which to perform a search.
name: index
required: true
- description: The string to query (in Lucene syntax).
name: query
predefined:
- ''
- description: A comma-separated list of document fields to fetch. If empty, the entire document is fetched.
isArray: true
name: fields
- auto: PREDEFINED
defaultValue: 'false'
description: Calculates an explanation of a score for a query. For example, "value:1.6943597".
name: explain
predefined:
- 'true'
- 'false'
- defaultValue: '0'
description: The page number from which to start a search.
name: page
- defaultValue: '100'
description: The number of documents displayed per page. Can be an integer between "1" and "10,000".
name: size
- description: The field by which to sort the results table. The supported result types are boolean, numeric, date, and keyword fields. Keyword fields require the doc_values parameter to be set to "true" from the Elasticsearch server.
name: sort-field
predefined:
- ''
- auto: PREDEFINED
defaultValue: asc
description: The order by which to sort the results table. The results tables can only be sorted if a sort-field is defined.
name: sort-order
predefined:
- asc
- desc
- description: Will override the 'query' argument. Results will not be filtered.
name: query_dsl
- description: The starting time of the time range.
name: timestamp_range_start
- description: The ending time of the time range.
name: timestamp_range_end
- description: Timestamp field name.
defaultValue: "@timestamp"
name: timestamp_field
description: Queries an index.
name: es-search
outputs:
- contextPath: Elasticsearch.Search.Results._index
description: The index to which the document belongs.
type: String
- contextPath: Elasticsearch.Search.Results._id
description: The ID of the document.
type: String
- contextPath: Elasticsearch.Search.Results._type
description: The mapping type of the document.
type: String
- contextPath: Elasticsearch.Search.max_score
description: The maximum relevance score of a query.
type: Number
- contextPath: Elasticsearch.Search.Query
description: The query performed in the search.
type: String
- contextPath: Elasticsearch.Search.total.value
description: The number of search results.
type: Number
- contextPath: Elasticsearch.Search.Results._score
description: The relevance score of the search result.
type: Number
- contextPath: Elasticsearch.Search.Index
description: The index in which the search was performed.
type: String
- contextPath: Elasticsearch.Search.Server
description: The server on which the search was performed.
type: String
- contextPath: Elasticsearch.Search.timed_out
description: Whether the search stopped due to a timeout.
type: Boolean
- contextPath: Elasticsearch.Search.took
description: The time in milliseconds taken for the search to complete.
type: Number
- contextPath: Elasticsearch.Search.Page
description: The page number from which the search started.
type: Number
- contextPath: Elasticsearch.Search.Size
description: The maximum number of scores that a search can return.
type: Number
- arguments:
- description: The index in which to perform a search.
name: index
required: true
- description: The string to query (in Lucene syntax).
name: query
predefined:
- ''
- description: A comma-separated list of document fields to fetch. If empty, fetches the entire document.
isArray: true
name: fields
- auto: PREDEFINED
defaultValue: 'false'
description: Calculates an explanation of a score for a query. For example, "value:1.6943597".
name: explain
predefined:
- 'true'
- 'false'
- defaultValue: '0'
description: The page number from which to start a search.
name: page
- defaultValue: '100'
description: The number of documents displayed per page. Can be an integer between "1" and "10,000".
name: size
- description: The field by which to sort the results table. The supported result types are boolean, numeric, date, and keyword fields. Keyword fields require the doc_values parameter to be set to "true" from the Elasticsearch server.
name: sort-field
predefined:
- ''
- auto: PREDEFINED
defaultValue: asc
description: The order by which to sort the results table. The results tables can only be sorted if a sort-field is defined.
name: sort-order
predefined:
- asc
- desc
- description: Timestamp field name.
defaultValue: "@timestamp"
name: timestamp_field
description: Searches an index.
name: search
outputs:
- contextPath: Elasticsearch.Search.Results._index
description: The index to which the document belongs.
type: String
- contextPath: Elasticsearch.Search.Results._id
description: The ID of the document.
type: String
- contextPath: Elasticsearch.Search.Results._type
description: The mapping type of the document.
type: String
- contextPath: Elasticsearch.Search.max_score
description: The maximum relevance score of a query.
type: Number
- contextPath: Elasticsearch.Search.Query
description: The query performed in the search.
type: String
- contextPath: Elasticsearch.Search.total.value
description: The number of search results.
type: Number
- contextPath: Elasticsearch.Search.Results._score
description: The relevance score of the search result.
type: Number
- contextPath: Elasticsearch.Search.Index
description: The index in which the search was performed.
type: String
- contextPath: Elasticsearch.Search.Server
description: The server on which the search was performed.
type: String
- contextPath: Elasticsearch.Search.timed_out
description: Whether the search stopped due to a timeout.
type: Boolean
- contextPath: Elasticsearch.Search.took
description: The time in milliseconds taken for the search to complete.
type: Number
- contextPath: Elasticsearch.Search.Page
description: The page number from which the search started.
type: Number
- contextPath: Elasticsearch.Search.Size
description: The maximum number of scores that a search can return.
type: Number
- name: get-mapping-fields
description: Returns the schema of the index to fetch from. This command should be used for debugging purposes.
- name: es-integration-health-check
description: Returns the health status of the integration. This command should be used for debugging purposes.
- description: Search using EQL query.
name: es-eql-search
arguments:
- description: The index in which to perform a search.
name: index
required: true
- description: The string to query (in Lucene syntax).
name: query
required: true
- description: A comma-separated list of document fields to fetch. If empty, fetches the entire document.
isArray: true
name: fields
- description: If two or more events share the same timestamp, Elasticsearch uses a tiebreaker field value to sort the events in ascending order.
name: sort-tiebreaker
- description: Filter using query DSL.
name: filter
- defaultValue: event.category
description: The event category field.
name: event_category_field
- defaultValue: '100'
description: The number of documents displayed per page. Can be an integer between "1" and "10,000".
name: size
- description: The starting time of the time range.
name: timestamp_range_start
- description: The ending time of the time range.
name: timestamp_range_end
- description: Timestamp field name.
defaultValue: "@timestamp"
name: timestamp_field
outputs:
- contextPath: Elasticsearch.Search.Results._index
description: The index to which the document belongs.
type: String
- contextPath: Elasticsearch.Search.Results._id
description: The ID of the document.
type: String
- contextPath: Elasticsearch.Search.Results._type
description: The mapping type of the document.
type: String
- contextPath: Elasticsearch.Search.max_score
description: The maximum relevance score of a query.
type: Number
- contextPath: Elasticsearch.Search.Query
description: The query performed in the search.
type: String
- contextPath: Elasticsearch.Search.total.value
description: The number of search results.
type: Number
- contextPath: Elasticsearch.Search.Results._score
description: The relevance score of the search result.
type: Number
- contextPath: Elasticsearch.Search.Index
description: The index in which the search was performed.
type: String
- contextPath: Elasticsearch.Search.Server
description: The server on which the search was performed.
type: String
- contextPath: Elasticsearch.Search.timed_out
description: Whether the search stopped due to a timeout.
type: Boolean
- contextPath: Elasticsearch.Search.took
description: The time in milliseconds taken for the search to complete.
type: Number
- contextPath: Elasticsearch.Search.Page
description: The page number from which the search started.
type: Number
- contextPath: Elasticsearch.Search.Size
description: The maximum number of scores that a search can return.
type: Number
- name: es-index
arguments:
- name: index_name
required: true
description: The name of the index to ingest into.
- name: document
required: true
description: The document object (JSON format) to be indexed. See Elasticsearch documentation (path_to_url#ex-index) for further information about indexing documents.
- name: id
description: The ID of the indexed document (will be generated if empty).
outputs:
- contextPath: Elasticsearch.Index.id
description: The ID of the indexed document.
type: string
- contextPath: Elasticsearch.Index.index
description: The name of the index into which the document was ingested.
type: string
- contextPath: Elasticsearch.Index.version
description: The version number of the indexed document.
type: number
- contextPath: Elasticsearch.Index.result
description: The result of the index operation.
type: string
description: Indexes a document into an Elasticsearch index.
dockerimage: demisto/elasticsearch:1.0.0.87483
isfetch: true
runonce: false
script: '-'
subtype: python3
type: python
ismappable: true
autoUpdateDockerImage: false
fromversion: 5.0.0
defaultmapperin: Elasticsearch - Incoming Mapper
tests:
- No tests (auto formatted)
``` |
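The `es-search` command above exposes `page` and `size` arguments, while Elasticsearch itself paginates with `from`/`size`, where `from = page * size`. A minimal sketch of how such a request body could be assembled — the function name and defaults are illustrative, not the integration's actual code:

```python
def build_search_body(query, page=0, size=100, sort_field=None, sort_order="asc"):
    """Assemble an Elasticsearch search request body from es-search-style arguments."""
    body = {
        "query": {"query_string": {"query": query}},
        "from": page * size,  # offset of the first result on this page
        "size": size,         # documents per page (1 to 10,000)
    }
    if sort_field:
        # Results can only be sorted when a sort field is defined.
        body["sort"] = [{sort_field: {"order": sort_order}}]
    return body

body = build_search_body("status:error", page=2, size=50, sort_field="@timestamp")
```

With `page=2` and `size=50`, the request skips the first 100 hits, matching the "page number from which to start a search" behavior described above.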
Bissetia steniellus is a moth in the family Crambidae. It was first described by the British entomologist George Hampson in 1899. It is found in India and Vietnam where it is commonly known as the Gurdaspur borer because the larvae bore their way into and feed on the stems of sugarcane.
Description
The adult Bissetia steniellus has a wingspan of . It is generally drab brown, and there are seven darker brown spots arranged between the veins on the outer margins of the forewings. The larva is creamy-white with four narrow, longitudinal, reddish stripes along the body, and an orange head.
Distribution
B. steniellus is found in India, Pakistan and Vietnam. Its range in India includes the states of Haryana, Punjab, Uttar Pradesh and Rajasthan.
Life cycle
The only known host plant for B. steniellus is the sugarcane. The female moth lays a batch of 100 to 300 eggs on the midrib of a sugarcane leaf. The larvae hatch out after about a week and make their way into the stem by drilling holes just above a node. They excavate galleries inside the stem, feeding voraciously. When the cane dries up and the crown of leaves dies in about ten days, the larvae move to a different cane in the vicinity. They feed for three to four weeks, passing through five instar stages, before pupating inside a stem. The adult moths emerge in six to twelve days. The male/female ratio varies between forty and sixty percent in the different generations. The whole life cycle lasts about five or six weeks, and there may be two or three generations in each year.
Damage
B. steniellus does varying amounts of damage to sugarcane crops, 25% of the crop sometimes being affected, with 75% damage known in severe cases. Affected plants are best destroyed and stubble cleared at the end of the season. The organochloride endrin is effective against the pest but is banned in many countries. The fly Sturmiopsis inferens is a naturally occurring parasitoid and its use as a possible biological pest control agent is being investigated. It has proved possible to rear the fly in the laboratory and release it in sugarcane plantations. In the March to June hot season, the fly mainly targets the sugarcane shoot borer (Chilo infuscatellus) and the pink borer (Sesamia inferens). In the monsoon period, July to October, it shifts to B. steniellus. Finally, between November and January, it targets the gold-fringed rice stemborer (Chilo auricilius).
References
Haimbachiini
Moths of Asia
Agricultural pest insects
Moths described in 1899
Taxa named by George Hampson |
Shinhan Bank Co., Ltd. () is a bank headquartered in Seoul, South Korea. Historically it was the first bank in Korea, established under the name Hanseong Bank in 1897. The bank was reestablished in 1982. It is part of the Shinhan Financial Group, along with Jeju Bank. Chohung Bank merged with Shinhan Bank on April 1, 2006.
Shinhan Bank started as a small enterprise with a capital stock of KRW 25.0 billion, 279 employees, and three branches on July 7, 1982. By 2006 it had grown into a large bank, with total assets of KRW 176.9 trillion, equity capital of KRW 9.7 trillion, 10,741 employees, and 1,026 branches. As of June 30, 2016, Shinhan Bank had total assets of , total deposits of and loans of . Shinhan Bank is the main subsidiary of Shinhan Financial Group (SFG).
History
Shinhan Bank is the descendant of Hanseong Bank, the first modern bank in Korea. It was established by Kim Jong-Han in 1897, but began operating around 1900. It was originally located in a small house with only two rooms: one for the president, Yi Jae-Won, and the other for the staff. The bank operated by borrowing money from Japanese banks at low interest rates and then lending it to the Korean market at twice that rate. It was successful because, even at double its borrowing rate, its interest rates were still far lower than could be obtained elsewhere in Korea at the time.
According to an anecdote, the first property offered to the bank as collateral on a loan was a donkey, leaving the bank staff to feed and care for their collateral while the loan was outstanding.
In March 2013, the Financial Services Commission of South Korea said that Shinhan Bank reported that its Internet banking servers had been temporarily blocked. The South Korean government asserted a North Korean link in the March cyberattacks, which has been denied by Pyongyang.
See also
List of South Korean companies
List of banks in South Korea
Big Four (banking)
Incheon Shinhan Bank S-Birds
References
External links
Official Website
Banks of South Korea
Companies based in Seoul
Banks established in 1897
South Korean brands
1897 establishments in Korea |
Giulia Bogliolo Bruna is an Italian ethno-historian living in France. She is a specialist in Renaissance voyages of discovery, in the imagery of the North and of the Inuit in Francophone and Anglophone travel literature, and in Inuit culture and traditional art.
A member of the Centre of Arctic Studies in Paris, founded and directed by Jean Malaurie, she sits on the editorial board of Internord, the International Review of Arctic Studies, and has served for two decades on that of Thule, the Italian review of Americanist studies.
She participated in the International Congress Arctic Problems, Environment, Society and Heritage, held in Paris from 8 to 10 March 2007 at the National Museum of Natural History (Muséum national d'histoire naturelle) under the honorary chairmanship of Jean Malaurie, which inaugurated the International Polar Year celebrations in France.
Giulia Bogliolo Bruna is a member of the Société de Géographie (Geographical Society) of Paris and of the Société des gens de lettres (as a professional writer), as well as of the Società Geografica Italiana. She is strongly committed to promoting human rights and transmitting knowledge.
Research topics
As a specialist in Renaissance voyages of discovery, Giulia Bogliolo Bruna investigates how the image of the Inuit evolved from the time of the first encounters to the 1960s. She also studies the close relationship between Inuit shamanic (angakkuq) thinking and Inuit artifacts, including the tupilaq.
As a researcher at the Centre of Arctic Studies, she has published several scholarly books, some devoted to the personality and thinking of the geo-ethnologist and writer Jean Malaurie, as well as more than 100 papers in leading reviews and journals.
As a specialist in modern literature and poetry, she has published several poetry collections and, in Italy, a major essay on Herman Melville. For her cultural commitment she received an Italian President's Representative Medal.
She is also a specialist in the image of Charles de Gaulle.
Books
Giulia Bogliolo Bruna has authored numerous scholarly books, which have received more than 80 academic reviews worldwide:
Herman Melville, "Profili di donne", opera a cura di Alberto Lehmann e Giulia Bogliolo Bruna, Maser (TV), Edizioni Amadeus, 1986.
André Thevet (traduction, notes, introduction), Le singolarità della Francia Antartica, préface de Frank Lestringant, Reggio Emilia, Diabasis, 1997.
Alla ricerca della quadratura del Circolo Polare : testimonianze e studi in onore di Jean Malaurie, “Il POLO”, nos 25-26, 1999 (a cura di).
Duc des Abruzzes, Expédition de l’Étoile polaire dans la Mer Arctique 1899-1900, Paris, coll. Polaires, Économica, 2004 (Préface).
Au nom de la liberté, Actes de la XVIe Journée Mondiale de la Poésie, Association "Poesia-2 Ottobre" - Mairie du 14e Arrdt. de Paris chez Yvelinédition, 2005, (sous la direction).
Thule, Rivista italiana di Studi Americanistici 16-17 Regards croisés sur l’objet ethnographique : autour des arts premiers, 2006 (sous la direction de).
Amarcord, je me souviens, Actes de la XVIIe Journée Mondiale de la Poésie, Association "Poesia-2 Ottobre" de Paris- Centre d'Information et d'Études sur les Migrations Internationales chez Yvelinédition, Montigny-le-Bretonneux, 2006, (sous la direction de).
Apparences trompeuses Sananguaq. Au cœur de la pensée inuit (préface Jean Malaurie, post-face Romolo Santoni), Latitude humaine, Yvelinédition, 2007.
Femme, l’autre moitié du ciel, Actes de la XVIIIe Journée mondiale de la poésie, préface de Danièle Pourtaud, Éditions d'en Face, Paris, 2007, (sous la direction de).
Jean Malaurie, une énergie créatrice, collection Lire et Comprendre, Editions Armand Colin, Paris, octobre 2012.
Les objets messagers de la pensée inuit, collection Ethiques de la création, préface de Jean Malaurie, postface de Sylvie Dallet, Editions L'Harmattan / Institut Charles Cros, septembre 2015.
Equilibri Artici. L'umanesimo ecologico di Jean Malaurie, prefazione di Anna Casella Paltrinieri, postfazione di Luisa Faldini, Roma, Edizioni CISU, collana "Ethno-grafie americane", settembre 2016.
Terra Madre. In omaggio all'immaginario della nazione inuit, di Jean Malaurie, traduzione e prefazione di Giulia Bogliolo Bruna, Milano, EDUCatt, 2017.
Selected academic papers
Une sauvage si sauvage: une esquimaude qui n’en était pas une…. Geostorie. Bollettino e Notiziario del Centro Italiano per gli Studi Storico-Geografici, 29(2), 2021, pp. 77–105.
La vox Eskimaux dans l’Encyclopédie de Diderot et d’Alembert: archéologie d’une dissonance cognitive. Libri, atti e raccolte di saggi, 2021, pp. 175–194.
Mysterium fascinans et tremendum': la conturbante e vorace tupinambá nelle singolarità della Francia antartica'del cosmografo André Thevet. Visioni LatinoAmericane, 18(2018), Supplemento al Numero 18. Brasile-Italia: andata e ritorno. Storia, cultura, società. Confronti interdisciplinari", Trieste, EUT Edizioni Università di Trieste, 2018, pp. 292–313.
Disclosing an Unknown Source of the Eskimo Entry of Diderot & d’Alembert's Encyclopédie. Journal of Literature and Art Studies, 9(11), 1139–1148.
Shamanism Influence in Inuit Art-Dorset Period, Journal of Literature and Art Studies, 2015, Vol. 5, N°4, p. 271-281.
“Esquimaux des Lumières”: archéologie d’un regard entravé, revue ANUAC, Vol. 3, N°1, 2014, pp. 1–19.
« Des races monstrueuses aux peuples maudits, des préadamites aux Homines religiosi: l’image des Esquimaux dans la littérature de voyage (XVIe siècle-première moitié du XVIIIE siècle) », Internord, Revue Internationale d’Études Arctiques, n°21, Editions du Centre National de la Recherche Scientifique (C.N.R.S), 2011, pp. 167–188.
« L’œuvre internationale du Centre d’Études Arctique », Internord, Revue Internationale d’Études Arctiques, n°21, Editions du Centre National de la Recherche Scientifique (C.N.R.S.), 2011, pp. 315–320.
«L’immagine dell’Altro... », in Geostorie, Anno 18 - n.1-2 - gennaio -agosto 2010, pp. 193– 204.
«Intervista a Jeorge Estevez, Taino. A lui non piace il sapote verde» in Sfumature di rosso. In Territorio Indiano con i Primi Americani a cura di Naila Clerici, Moncalieri, SOCONAS INCOMINDIOS, 2011, pp. 59–64.
« Intermondes: jeu d’identités et réappropriation des racines. Un Inuit de la toundra à la guerre de Corée. », XXXII International Americanistic Studies Congress, Perugia, 2011.
« Rêver l’Arctique et l’enseigner : les ambigüités de la planche Rossignol« Climat Froid » à l’heure de la décolonisation », XXXI International Americanistic Studies Congress, Perugia, 2010.
"Intermondi. Il richiamo del Sacro", in GEOSTORIE, Bollettino e Notiziario del CentroItaliano per gli Studi Storico-Geografici, gennaio-aprile 2009, p. 119-123.
"“Préadamites, Juifs errants, Tartares? Des "origines" des Esquimaux d´après les sources documentaires et littéraires des XVIe-XVIIIe siècle, International Americanistic Studies Congress, Perugia, 2008, p. 987-996.
"De la merveille à la curiosité. La perception du “Théâtre due la Nature Universelle" chez les voyageurs, marchands et savants de la Renaissance", Naturalia, mirabilia & monstruosa en el mundo iberico, siglos XVI-XIX : VI Coloquio International Mediadores Culturales, Lovania y Amberes, 2007, p. 1-29.
"Explorer les cartes, les textes et les images : en quête de pygmées arctiques et d’homme-poissons. Prolégomènes à la première rencontre", in Séminaire de Jean Malaurie (sous la direction de Jean Malaurie, Dominique Sewane (coord.), De la vérité in ethnologie, Paris, Economica, Collection Polaires, 2002, p. 79-96 ;
"Mestizaje de técnicas practicas y conocimientos en los inuit del Grand Norte de Canada y Groenlandia (siglos XVI-XIX)", Eduardo França Paiva, Carla Maria Junho Anastasia organizadores, O Trabalho mestiço. Maneiras de Pensar e Formas de Viver Séculos XVI a XIX, São Paulo, Annablume, PPGH/UFMG, 2002 ;
"Pigmei, Ciclopi ed Antropofagi del Grande Nord: le ambiguità di uno sguardo preformato (sec.XVI-XVIII)", XXIV International Americanistic Studies Congress, Perugia 10, 11, 12 May 2002/ São Paulo, Brasile 6, 7, 8 August 2002, Centro Studi Americanistici Circolo Amerindiano / ARGO, p. 79-86 ;
" Du mythe à la réalité: l’image des Esquimaux dans la littérature de voyage (XVI-XVIIIe siècles)", in Commission of History of International Relations, Oslo 2000 Commémorative Volume, Papers for the XIX International Congress of Historical Sciences, Oslo, 2000 ;
"Quando l’anernira si fa canto e memoria…", in Rivista di Studi Canadesi Canadian Studies Review Suppl. n. 13 Anno 2000, p. 16-27 ;
"Passer les frontières: les Inuit du Labrador (fin du XVIe-première moitié du XVIIIe)", in Rui Manuel Loureiro et Serge Gruzinski, Passar as fronteiras, II Colóquio International sobre Mediadores Culturais, Séculos XV a XVIII, Lagos, 1999, p. 1-110 ;
"Paese degli Iperborei, Ultima Thule, Paradiso Terrestre. Lo spazio boreale come altrove transgeografico ed escatologico dall’Antichità a Mercatore" in Columbeis VI, Genova, D.AR.FI.CL.ET., 1997 (Stefano Pittaluga Ed.), p. 161-178 ;
"Dalla descrizione testuale all’immagine grafica: l’inquietante tupinambá, la bella “floridienne” e la morigerata eschimese: l’alterità al femminile, ovvero lo “choc” esotico, sensuale e simbolico della Scoperta nella letteratura odeporica del XVI secolo" in: Atti del XXVI Congresso Geografico Italieno, (Genova, 4-9 mai 1992), Ed. Claudio Cerreti, Roma, Istituto dell’Enciclopedia Italiana Giovanni Treccani, 1996, p. 767-777 ;
"De Gaulle dans les manuels italiens de la Scuola Media Inferiore" p. 9-33 in De Gaulle enseigné dans le monde, Table ronde organisée au Sénat par la Fondation Charles de Gaulle le 22 mai 1995, Paris, Cahier de la Fondation Charles de Gaulle, no 2, 1995 ;
"I resoconti dell’Accademia delle Scienze di Parigi sull’attività geocartografica di Agostino Codazzi in Venezuela Cinquecento in Miscellanea di Storia delle Esplorazioni XX", Genova, Bozzi, 1995, p. 235-236 ;
"Singularitez, testimonianza etnografica, allegoria : l’immagine degli Eschimesi nell’iconografia del Rinascimento in Atti Convegno Messina 14-15 ottobre 1993 Esplorazioni geografiche e immagine del mondo nei secoli XV et XVI", - a cura di Simonetta Ballo Alagna-, p. 255-267 ;
"La cultura materiale degli Eschimesi del Labrador in alcune fonti documentarie della Nouvelle France (1650-1750)", in IL POLO, vol. 1 marzo 1994, p. 19-28 ;
"Visbooc : pesci, mostri marini e « Savages of the northe ». una testimonianza sugli Eschimesi in Europa nella seconda metà del Cinquecento" in Miscellanea di Storia delle Esplorazioni XIX, Genova, Bozzi, 1994, p. 55-66 ;
"Singularitez, testimonianza etnografica, allegoria : l’immagine degli Eschimesi nell’iconografia del Rinascimento", in Proceedings of the Messian Congress (14-15 October 1993), Esplorazioni geografiche e immagine del mondo nei secoli XV et XVI, i Simonetta Ballo Alagna (Ed.), p. 255-267 ;
"Journal de Louis Jolliet allant à la descouverte de Labrador, païs des Esquimaux », 1694. La prima fonte etnostorica sugli Inuit del Labrador" in Atti del V Convegno internazionale di studi dell’Associazione per il Medioevo e l’Umanesimo Latini, Relazioni di viaggio e conoscenza del mondo fra Medioevo e Umanesimo, Genova, 12-15 dicembre 1991, Stefano Pittaluga (Ed.) Genova, Dipartimento di Archeologia, Filologia Classica, 1993 p. 591-615 ;
"Premiers regards des Occidentaux sur les Inuit" in DESTINS CROISES Cinq siècles de rencontres avec les Amérindiens, Paris, UNESCO Bibliothèque Albin Michel, 1992, p. 393-410 ;
Colonizzazione ed etnocidio nella “Description géographique” di Nicolas Denys in Columbeis IV, Gênes, D.AR.FI.CL.ET., 1990, p. 375-390 ;
"La place du général de Gaulle dans les livres scolaires italiens" in De Gaulle en son siècle, tome 1, Dans la mémoire des hommes et des hommes et des peuples, Actes des Journées internationales tenues à l’UNESCO, Paris, 19-24 novembre 1990, Paris, Institut Charles de Gaulle-Plon- La Documentation française, 1990, 369-382 ;
Dalla realtà al mito: “Les Nouveaux Voyages aux Indes Occidentales” di M. Jean-Bernard Bossu in Miscellanea di Storia delle Esplorazioni XIII, Genova, Bozzi, 1988, p. 165-192 ;
Tracce di conoscenze cartografiche presso alcune tribù indiane del Nordamerica nella prima metà del sec. XVIII, in Miscellanea di Storia delle Esplorazioni XIII, Genova, Bozzi, 1988, p. 149- 164 ;
Amazzoni o cannibali, vergini o madri, sante o prostitute... in Columbeis III, Genova, D.AR.FI.CL.ET., 1988, p. 215-264 ;
Dalla realtà al mito: “Les Nouveaux Voyages aux Indes Occidentales” di M. Jean-Bernard Bossu, in Miscellanea di Storia delle Esplorazioni XIII, Genova, Bozzi, 1988 pp. 165–192 ;
Tracce di conoscenze cartografiche presso alcune tribù indiane del Nordamerica nella prima metà del sec. XVIII in Miscellanea di Storia delle Esplorazioni XIII, Genova, Bozzi, 1988, p. 149- 164 ;
Uno sguardo protoetnografico sull’amerindio: “Les Nouveaux Voyages aux Indes Occidentales” di M. Jean-Bernard Bossu, Miscellanea di Storia delle Esplorazioni XII, Genova, Bozzi, 1987, p. 93- 117 ;
All’insegna del “soggettivo”: la poliedrica visione dell’omosessuale nella popolazioni “primitive” del Nuovo Mondo, come effetto di pregiudizio da parte di relatori e memorialisti in Miscellanea di Storia delle Esplorazioni XII, Genova, Bozzi, 1987, p. 73-92;
La relazione sulla baia di Hudson di “Monsieur Jérémie” Miscellanea di Storia delle Esplorazioni XI, Genova, Bozzi, 1986, p. 39-70
Una fonte inedita sulla tratta degli schiavi nel Madagascar precoloniale: le “Lettres Madagascaroises” di M. De Valgny in Miscellanea di Storia delle Esplorazioni X, Genova, Bozzi, 1985, p. 147-170 ;
Il Madagascar in una lettera di M. De Barry all’Accademia reale delle Scienze di Parigi, in Miscellanea di Storia delle Esplorazioni IX, Genova, Bozzi, 1978, p. 53-70 ;
Alcune lettere dalla Cina dell’agostiniano Sigismondo Meynardi da San Nicola Miscellanea di Storia delle Esplorazioni III, Genova, Bozzi, 1978, p. 127-152 ;
"Una fonte sconosciuta del Botero : l’Historia de la China di Juan Gonzalez de Mendoza", in Miscellanea di Storia delle Esplorazioni II, Genova, Bozzi, 1977, p. 48-78.
See also
Inuit religion
André Thévet
References
External links
International Congress "Arctic problems", chaired by Prof. Jean Malaurie
Radio Show "Radio3 Mondo" moderated by the journalist Anna Mazzone, with Giulia Bogliolo Bruna, Matteo Smolizza, Patrick Agnew, Daniela Tommasini, 19/12/2012.
Exhibition catalogue "Equilibres arctiques", under the scientific direction of Giulia Bogliolo Bruna, Galerie-Librairie des Éditions Caractères (director : Nicole Gdalia), 8 - 25 March 2016.
Reports on Giulia Bogliolo Bruna's book "Jean Malaurie, une énergie créatrice" de Giulia Bogliolo Bruna, Paris, Ed. Armand Colin, 2012.
Reports on Giulia Bogliolo Bruna's book "Les objets messagers de la pensée inuit" de Giulia Bogliolo Bruna, preface Jean Malaurie, postface Sylvie Dallet, Paris, Ed. Harmattan, 2015.
Radio Show "Radio3Mondo" moderated by Azzurra Meringolo, with : Giulia Bogliolo Bruna, 30/04/2015, radio3.rai.
Radio Show "Nuovi Equilibri Artici" Radio3Mondo moderated by Anna Mazzone & Azzurra Meringolo, with Giulia Bogliolo Bruna, 26/10/2016, radio3.rai.it
Italian literary historians
Italian women historians
20th-century Italian historians
20th-century Italian women writers
21st-century Italian historians
21st-century Italian women writers
Italian expatriates in France
Year of birth missing (living people)
Living people |
- Managing branches
- Workflow: topic branches
- Setting the upstream branch
- Pulling a remote branch
- View your commit history in a graph |
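The headings above name common branch-management tasks; here is a runnable sketch of that workflow (repository paths and branch names are illustrative, and a local bare repository stands in for a hosted remote):

```shell
# Create a throwaway remote and working clone so the example is self-contained.
set -e
root="$(mktemp -d)"
git init --bare --quiet "$root/origin.git"            # stand-in for a hosted remote
git clone --quiet "$root/origin.git" "$root/work"
cd "$root/work"
git config user.email "you@example.com"
git config user.name "You"
git commit --allow-empty --quiet -m "initial commit"
git push --quiet origin HEAD                          # publish the default branch

# Topic-branch workflow: one short-lived branch per unit of work.
git switch -c topic/cleanup --quiet
git commit --allow-empty --quiet -m "tidy up"
git push --quiet --set-upstream origin topic/cleanup  # set the upstream branch
git pull --quiet                                      # pulls from the upstream just configured
git log --oneline --graph --all                       # view commit history as a graph
```

Once the upstream is set, plain `git pull` and `git push` need no arguments on that branch.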
- `currentColor` improves code reusability
- Use `box-sizing` to define an element's `width` and `height` properties
- Hide the scrollbar in WebKit browsers
- Disclose the file format of links
- Debug with the `*` selector |
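The tips above can be sketched in a few rules (all selectors and values are illustrative):

```css
/* currentColor: the border reuses whatever `color` resolves to */
.badge { color: #0a7; border: 1px solid currentColor; }

/* box-sizing: width and height include padding and border */
.card { box-sizing: border-box; width: 200px; padding: 16px; }

/* hide the scrollbar in WebKit browsers */
.scroller::-webkit-scrollbar { display: none; }

/* disclose the file format of links via an attribute selector */
a[href$=".pdf"]::after { content: " (PDF)"; }

/* debug layout by outlining every element */
* { outline: 1px solid rgba(255, 0, 0, 0.2); }
```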
Hoffleit is a surname. Notable people with the surname include:
Dorrit Hoffleit (1907-2007), American astronomer
Renate Hoffleit (born 1950), German sculptor and artist |
Stepping stone(s) may refer to:
Stepping stones, stones placed to allow pedestrians to cross a watercourse
Places
Stepping Stone, Virginia, US, an unincorporated community
Stepping Stones (islands), Antarctic and sub-Antarctic
Buildings
Stepping Stones (house), of Bill and Lois Wilson of Alcoholics Anonymous, in Bedford Hills, New York, US
Stepping Stones Light, a lighthouse on Long Island Sound, New York, US
Stepping Stones Museum for Children, Norwalk, Connecticut, US
"Stepping Stones", home of Jacques Futrelle in Scituate, Massachusetts, US
Film and theatre
The Stepping Stone, a 1916 American silent film
Stepping Stones (film), a 1931 British musical
Stepping Stones (musical), a 1923 Broadway musical
Music
Albums
Stepping Stone (album) or the title song (see below), by Lari White, 1998
Stepping Stones (album), by Wendy Matthews, 1999
Stepping Stones: Live at the Village Vanguard, by Woody Shaw, 1979
Songs
"Stepping Stone" (Duffy song), 2008
"Stepping Stone" (Eminem song), 2018
"Stepping Stone" (Jimi Hendrix song), 1970
"Stepping Stone" (Lari White song), 1998
"(I'm Not Your) Steppin' Stone", a song written by Tommy Boyce and Bobby Hart, 1966; recorded by many performers
"Stepping Stone", by Argent from Argent, 1970
"Stepping Stone", by AM
"Steppin' Stone", by Black Label Society from Hangover Music Vol. VI, 2004
"Stepping Stone", by Clannad from Sirius, 1987
"Stepping Stone", by Natasha Bedingfield from N.B., 2007
"Stepping Stones", by Bert Jansch and John Renbourn from Bert and John, 1966
"Stepping Stones", by the Headboys, 1979
"Stepping Stones", by Johnny Harris, 1970
Schools
Stepping Stone Educational Centre, Port Harcourt, Rivers State, Nigeria
Stepping Stone Model School, Alipurduar, West Bengal, India
Other uses
Stepping Stone Purse, an American horse race
Stepping-stone squeeze, a contract bridge technique
Stepping Stones: Interviews with Seamus Heaney, a 2008 book by Dennis O'Driscoll
Stepping Stones, a 1977 UK political report by John Hoskyns and Norman S. Strauss
See also
Island hopping (disambiguation)
Stepstone (disambiguation) |
```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading.Tasks;
using Certify.Config;
using Certify.Models;
using Certify.Models.API;
using Certify.Models.Config;
using Certify.Models.Config.Migration;
using Certify.Models.Providers;
using Certify.Providers;
using Certify.Shared;
namespace Certify.Management
{
public interface ICertifyManager
{
Task Init();
void SetStatusReporting(IStatusReporting statusReporting);
Task<bool> IsServerTypeAvailable(StandardServerTypes serverType);
Task<Version> GetServerTypeVersion(StandardServerTypes serverType);
Task<List<ActionStep>> RunServerDiagnostics(StandardServerTypes serverType, string siteId);
Task<ManagedCertificate> GetManagedCertificate(string id);
Task<List<ManagedCertificate>> GetManagedCertificates(ManagedCertificateFilter filter = null);
Task<ManagedCertificateSearchResult> GetManagedCertificateResults(ManagedCertificateFilter filter = null);
Task<Certify.Models.Reporting.StatusSummary> GetManagedCertificateSummary(ManagedCertificateFilter filter = null);
Task<ManagedCertificate> UpdateManagedCertificate(ManagedCertificate site);
Task DeleteManagedCertificate(string id);
Task<ImportExportPackage> PerformExport(ExportRequest exportRequest);
Task<List<ActionStep>> PerformImport(ImportRequest importRequest);
Task<List<SimpleAuthorizationChallengeItem>> GetCurrentChallengeResponses(string challengeType, string key = null);
Task<List<AccountDetails>> GetAccountRegistrations();
Task<ActionResult> AddAccount(ContactRegistration reg);
Task<ActionResult> UpdateAccountContact(string storageKey, ContactRegistration contact);
Task<ActionResult> RemoveAccount(string storageKey, bool includeAccountDeactivation = false);
Task<ActionResult<AccountDetails>> ChangeAccountKey(string storageKey, string newKeyPEM = null);
Task<List<StatusMessage>> TestChallenge(ILog log, ManagedCertificate managedCertificate, bool isPreviewMode, IProgress<RequestProgressState> progress = null);
Task<List<StatusMessage>> PerformChallengeCleanup(ILog log, ManagedCertificate managedCertificate, IProgress<RequestProgressState> progress = null);
Task<List<ActionResult>> PerformServiceDiagnostics();
Task<List<DnsZone>> GetDnsProviderZones(string providerTypeId, string credentialsId);
Task<ActionResult> UpdateCertificateAuthority(CertificateAuthority certificateAuthority);
Task<List<CertificateAuthority>> GetCertificateAuthorities();
Task<StatusMessage> RevokeCertificate(ILog log, ManagedCertificate managedCertificate);
Task<CertificateRequestResult> PerformDummyCertificateRequest(ManagedCertificate managedCertificate, IProgress<RequestProgressState> progress = null);
Task<ActionResult> RemoveCertificateAuthority(string id);
Task<List<SiteInfo>> GetPrimaryWebSites(StandardServerTypes serverType, bool ignoreStoppedSites, string itemId = null);
Task<List<CertificateRequestResult>> RedeployManagedCertificates(ManagedCertificateFilter filter, IProgress<RequestProgressState> progress = null, bool isPreviewOnly = false, bool includeDeploymentTasks = false);
Task<CertificateRequestResult> DeployCertificate(ManagedCertificate managedCertificate, IProgress<RequestProgressState> progress = null, bool isPreviewOnly = false, bool includeDeploymentTasks = false);
Task<CertificateRequestResult> PerformCertificateRequest(ILog log, ManagedCertificate managedCertificate, IProgress<RequestProgressState> progress = null, bool resumePaused = false, bool skipRequest = false, bool failOnSkip = false, bool skipTasks = false, bool isInteractive = false, string reason = null);
Task<List<DomainOption>> GetDomainOptionsFromSite(StandardServerTypes serverType, string siteId);
Task<List<CertificateRequestResult>> PerformRenewAll(RenewalSettings settings, ConcurrentDictionary<string, Progress<RequestProgressState>> progressTrackers = null);
Task<bool> PerformRenewalTasks();
Task<bool> PerformDailyMaintenanceTasks();
Task PerformCertificateCleanup();
Task<List<ActionResult>> PerformCertificateMaintenanceTasks(string managedItemId = null);
Task<List<ActionStep>> GeneratePreview(ManagedCertificate item);
void ReportProgress(IProgress<RequestProgressState> progress, RequestProgressState state, bool logThisEvent = true);
Task<List<ActionStep>> PerformDeploymentTask(ILog log, string managedCertificateId, string taskId, bool isPreviewOnly, bool skipDeferredTasks, bool forceTaskExecution);
Task<List<DeploymentProviderDefinition>> GetDeploymentProviders();
Task<List<ActionResult>> ValidateDeploymentTask(ManagedCertificate managedCertificate, DeploymentTaskConfig taskConfig);
Task<DeploymentProviderDefinition> GetDeploymentProviderDefinition(string id, DeploymentTaskConfig config);
Task<LogItem[]> GetItemLog(string id, int limit = 1000);
Task<string[]> GetServiceLog(string logType, int limit = 10000);
ICredentialsManager GetCredentialsManager();
IManagedItemStore GetManagedItemStore();
Task ApplyPreferences();
Task<List<ProviderDefinition>> GetDataStoreProviders();
Task<List<DataStoreConnection>> GetDataStores();
Task<List<ActionStep>> CopyDateStoreToTarget(string sourceId, string destId);
Task<List<ActionStep>> SetDefaultDataStore(string dataStoreId);
Task<List<ActionStep>> UpdateDataStoreConnection(DataStoreConnection dataStore);
Task<List<ActionStep>> RemoveDataStoreConnection(string dataStoreId);
Task<List<ActionStep>> TestDataStoreConnection(DataStoreConnection connection);
Task<ActionResult> TestCredentials(string storageKey);
Task<Core.Management.Access.IAccessControl> GetCurrentAccessControl();
}
}
``` |
American R&B singer-songwriter Keke Palmer has released two studio albums, three extended plays, three mixtapes and 28 singles. In 2005, she signed a record deal with Atlantic Records. Palmer released her debut album, So Uncool, on September 18, 2007. The album failed to chart on the US Billboard 200 but reached number 85 on the R&B chart; it was preceded by its second single, "Keep It Movin'". In 2010, Palmer was signed by Jimmy Iovine, chairman of Interscope Records, and began working on an album.
On January 10, 2011, Palmer released her first mixtape, Awaken, as a download on mixtape-hosting websites. Its first and only single, "The One You Call", was accompanied by a music video. In July 2012, Palmer released the single "You Got Me" featuring Kevin McCall; its video followed on July 11, 2012. Palmer released a self-titled mixtape, Keke Palmer, on October 1, 2012, which included her previously released singles "You Got Me" and "Dance Alone".
Albums
Studio albums
Mixtapes
Soundtracks
Reissue albums
Compilation albums
Extended plays
Singles
As lead artist
As featured artist
Promotional singles
Other charted songs
Other appearances
Music videos
Main artist
Guest appearances
Notes
References
Rhythm and blues discographies
Pop music discographies
Discographies of American artists |
```c++
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// path_to_url
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
#include <glog/logging.h>
#include <gtest/gtest.h>
#include <memory>
#include "paddle/fluid/pir/dialect/operator/ir/op_dialect.h"
#include "paddle/fluid/pir/dialect/operator/ir/pd_op.h"
#include "paddle/fluid/pir/serialize_deserialize/include/interface.h"
#include "paddle/pir/include/core/builtin_dialect.h"
#include "paddle/pir/include/core/operation.h"
#include "paddle/pir/include/core/program.h"
TEST(SaveTest, uncompleted_parameter) {
pir::IrContext* ctx = pir::IrContext::Instance();
ctx->GetOrRegisterDialect<paddle::dialect::OperatorDialect>();
ctx->GetOrRegisterDialect<pir::BuiltinDialect>();
pir::Program program(ctx);
pir::Builder builder = pir::Builder(ctx, program.block());
paddle::dialect::FullOp full_op1 =
builder.Build<paddle::dialect::FullOp>(std::vector<int64_t>{64, 64}, 1.5);
pir::OpInfo op_info = ctx->GetRegisteredOpInfo(pir::ParameterOp::name());
std::vector<pir::Value> inputs;
std::vector<pir::Type> output_types;
output_types.push_back(full_op1.out().type());
pir::AttributeMap attributes;
attributes.insert(
{"parameter_name", pir::StrAttribute::get(ctx, "test_param")});
pir::Operation* op =
pir::Operation::Create(inputs, attributes, output_types, op_info);
program.block()->push_back(op);
pir::WriteModule(
program, "./test_param", /*pir_version*/ 0, true, false, true);
pir::Program new_program(ctx);
pir::ReadModule("./test_param", &new_program, /*pir_version*/ 0);
pir::Operation& new_op = new_program.block()->back();
EXPECT_EQ(new_op.attribute("is_distributed").isa<pir::ArrayAttribute>(),
true);
EXPECT_EQ(new_op.attribute("is_parameter").isa<pir::ArrayAttribute>(), true);
EXPECT_EQ(new_op.attribute("need_clip").isa<pir::ArrayAttribute>(), true);
EXPECT_EQ(new_op.attribute("parameter_name").isa<pir::StrAttribute>(), true);
EXPECT_EQ(new_op.attribute("persistable").isa<pir::ArrayAttribute>(), true);
EXPECT_EQ(new_op.attribute("stop_gradient").isa<pir::ArrayAttribute>(), true);
EXPECT_EQ(new_op.attribute("trainable").isa<pir::ArrayAttribute>(), true);
}
``` |
Richburg is a town in Chester County, South Carolina, United States. The population was 275 at the 2010 census, down from 332 at the 2000 census.
History
The Elliott House and Landsford Plantation House are listed on the National Register of Historic Places.
Geography
Richburg is located in east-central Chester County at (34.717374, -81.019635). Interstate 77 passes just west of the town, with access from Exits 62 and 65. I-77 leads north to Charlotte and south to Columbia. South Carolina Highway 9 passes through the northeast side of the town, leading west to Chester, the county seat, and east to Lancaster.
According to the United States Census Bureau, Richburg has a total area of , all of it land.
Demographics
As of the census of 2000, there were 332 people, 122 households, and 87 families residing in the town. The population density was . There were 134 housing units at an average density of . The racial makeup of the town was 23.49% White, 74.70% African American, 1.20% Asian, and 0.60% from two or more races. Hispanic or Latino of any race were 1.20% of the population.
There were 122 households, out of which 31.1% had children under the age of 18 living with them, 38.5% were married couples living together, 27.0% had a female householder with no husband present, and 27.9% were non-families. 24.6% of all households were made up of individuals, and 15.6% had someone living alone who was 65 years of age or older. The average household size was 2.72 and the average family size was 3.26.
In the town, the population was spread out, with 24.7% under the age of 18, 7.8% from 18 to 24, 21.7% from 25 to 44, 29.5% from 45 to 64, and 16.3% who were 65 years of age or older. The median age was 40 years. For every 100 females, there were 77.5 males. For every 100 females age 18 and over, there were 71.2 males.
The median income for a household in the town was $31,875, and the median income for a family was $40,000. Males had a median income of $35,893 versus $18,295 for females. The per capita income for the town was $13,048. About 9.7% of families and 13.6% of the population were below the poverty line, including 22.5% of those under age 18 and 21.4% of those age 65 or over.
Education
Richburg has a public library, a branch of the Chester County Library System.
Media
WRBK, 90.3 FM, a noncommercial station that primarily features classic oldies
Notable residents
Buck Baker, NASCAR driver
Sheldon Brown, professional football player
Marty Marion, baseball player (birthplace)
References
External links
109 Broad Street - Richburg Post Office
Towns in Chester County, South Carolina
Towns in South Carolina |
Salegentibacter sediminis is a Gram-negative, aerobic, rod-shaped and non-motile bacterium in the genus Salegentibacter, which has been isolated from sediment collected off the coast of Weihai.
References
Flavobacteria
Bacteria described in 2018 |
```java
package home.smart.fly.animations.recyclerview.customitemdecoration.sticky;
import android.graphics.Canvas;
import android.graphics.Paint;
import android.view.View;
import android.view.ViewGroup;
import androidx.recyclerview.widget.LinearLayoutManager;
import androidx.recyclerview.widget.RecyclerView;
import java.util.ArrayList;
import java.util.List;
/**
* Created by cpf on 2018/1/16.
*/
public class StickyItemDecoration extends RecyclerView.ItemDecoration {
/**
 * the sticky item view currently pinned at the top
 */
private View mStickyItemView;
/**
 * offset by which the pinned view is pushed up by the next sticky view
 */
private int mStickyItemViewMarginTop;
/**
 * measured height of the pinned view
 */
private int mStickyItemViewHeight;
/**
 * strategy that decides which child views are sticky
 */
private StickyView mStickyView;
/**
 * whether a sticky view is visible in the current layout pass
 */
private boolean mCurrentUIFindStickView;
/**
 * the RecyclerView's adapter, cached on first use
 */
private RecyclerView.Adapter<RecyclerView.ViewHolder> mAdapter;
/**
 * view holder used to bind and draw the pinned view
 */
private RecyclerView.ViewHolder mViewHolder;
/**
 * adapter positions of the sticky views seen so far
 */
private List<Integer> mStickyPositionList = new ArrayList<>();
/**
* layout manager
*/
private LinearLayoutManager mLayoutManager;
/**
 * adapter position currently bound to the pinned view
 */
private int mBindDataPosition = -1;
/**
* paint
*/
private Paint mPaint;
public StickyItemDecoration() {
mStickyView = new ExampleStickyView();
initPaint();
}
/**
* init paint
*/
private void initPaint() {
mPaint = new Paint();
mPaint.setAntiAlias(true);
}
@Override
public void onDrawOver(Canvas c, RecyclerView parent, RecyclerView.State state) {
super.onDrawOver(c, parent, state);
if (parent.getAdapter().getItemCount() <= 0) return;
mLayoutManager = (LinearLayoutManager) parent.getLayoutManager();
mCurrentUIFindStickView = false;
for (int m = 0, size = parent.getChildCount(); m < size; m++) {
View view = parent.getChildAt(m);
// found a sticky view among the visible children
if (mStickyView.isStickyView(view)) {
mCurrentUIFindStickView = true;
getStickyViewHolder(parent);
cacheStickyViewPosition(m);
if (view.getTop() <= 0) {
bindDataForStickyView(mLayoutManager.findFirstVisibleItemPosition(), parent.getMeasuredWidth());
} else {
if (mStickyPositionList.size() > 0) {
if (mStickyPositionList.size() == 1) {
bindDataForStickyView(mStickyPositionList.get(0), parent.getMeasuredWidth());
} else {
int currentPosition = getStickyViewPositionOfRecyclerView(m);
int indexOfCurrentPosition = mStickyPositionList.lastIndexOf(currentPosition);
if (indexOfCurrentPosition >= 1) bindDataForStickyView(mStickyPositionList.get(indexOfCurrentPosition - 1), parent.getMeasuredWidth());
}
}
}
if (view.getTop() > 0 && view.getTop() <= mStickyItemViewHeight) {
mStickyItemViewMarginTop = mStickyItemViewHeight - view.getTop();
} else {
mStickyItemViewMarginTop = 0;
View nextStickyView = getNextStickyView(parent);
if (nextStickyView != null && nextStickyView.getTop() <= mStickyItemViewHeight) {
mStickyItemViewMarginTop = mStickyItemViewHeight - nextStickyView.getTop();
}
}
drawStickyItemView(c);
break;
}
}
if (!mCurrentUIFindStickView) {
mStickyItemViewMarginTop = 0;
if (mLayoutManager.findFirstVisibleItemPosition() + parent.getChildCount() == parent.getAdapter().getItemCount() && mStickyPositionList.size() > 0) {
bindDataForStickyView(mStickyPositionList.get(mStickyPositionList.size() - 1), parent.getMeasuredWidth());
}
drawStickyItemView(c);
}
}
/**
 * Finds the second visible sticky view, i.e. the one about to push the pinned view up.
 * @param parent the RecyclerView being decorated
 * @return the next sticky view, or null if at most one is visible
 */
private View getNextStickyView(RecyclerView parent) {
int num = 0;
View nextStickyView = null;
for (int m = 0, size = parent.getChildCount(); m < size; m++) {
View view = parent.getChildAt(m);
if (mStickyView.isStickyView(view)) {
nextStickyView = view;
num++;
}
if (num == 2) break;
}
return num >= 2 ? nextStickyView : null;
}
/**
 * Binds adapter data for the given position to the pinned view.
 * @param position adapter position to bind
 * @param width parent width used to measure the pinned view
 */
private void bindDataForStickyView(int position, int width) {
if (mBindDataPosition == position || mViewHolder == null) return;
mBindDataPosition = position;
mAdapter.onBindViewHolder(mViewHolder, mBindDataPosition);
measureLayoutStickyItemView(width);
mStickyItemViewHeight = mViewHolder.itemView.getBottom() - mViewHolder.itemView.getTop();
}
/**
 * Caches the adapter position of a sticky view.
 * @param m index of the view among the visible children
 */
private void cacheStickyViewPosition(int m) {
int position = getStickyViewPositionOfRecyclerView(m);
if (!mStickyPositionList.contains(position)) {
mStickyPositionList.add(position);
}
}
/**
 * Converts a visible-child index into an adapter position.
 * @param m index of the view among the visible children
 * @return the corresponding adapter position
 */
private int getStickyViewPositionOfRecyclerView(int m) {
return mLayoutManager.findFirstVisibleItemPosition() + m;
}
/**
 * Lazily creates the view holder used to draw the pinned view.
 * @param recyclerView the RecyclerView being decorated
 */
private void getStickyViewHolder(RecyclerView recyclerView) {
if (mAdapter != null) return;
mAdapter = recyclerView.getAdapter();
mViewHolder = mAdapter.onCreateViewHolder(recyclerView, mStickyView.getStickViewType());
mStickyItemView = mViewHolder.itemView;
}
/**
 * Measures and lays out the pinned view to match the parent width.
 * @param parentWidth width of the RecyclerView
 */
private void measureLayoutStickyItemView(int parentWidth) {
if (mStickyItemView == null || !mStickyItemView.isLayoutRequested()) return;
int widthSpec = View.MeasureSpec.makeMeasureSpec(parentWidth, View.MeasureSpec.EXACTLY);
int heightSpec;
ViewGroup.LayoutParams layoutParams = mStickyItemView.getLayoutParams();
if (layoutParams != null && layoutParams.height > 0) {
heightSpec = View.MeasureSpec.makeMeasureSpec(layoutParams.height, View.MeasureSpec.EXACTLY);
} else {
heightSpec = View.MeasureSpec.makeMeasureSpec(0, View.MeasureSpec.UNSPECIFIED);
}
mStickyItemView.measure(widthSpec, heightSpec);
mStickyItemView.layout(0, 0, mStickyItemView.getMeasuredWidth(), mStickyItemView.getMeasuredHeight());
}
/**
 * Draws the pinned view, translated up by the push-up margin.
 * @param canvas the canvas supplied to onDrawOver
 */
private void drawStickyItemView(Canvas canvas) {
if (mStickyItemView == null) return;
int saveCount = canvas.save();
canvas.translate(0, -mStickyItemViewMarginTop);
mStickyItemView.draw(canvas);
canvas.restoreToCount(saveCount);
}
}
``` |
```viml
" Vim syntax file
" Language: mir
" Maintainer: The LLVM team, path_to_url
" Version: $Revision$
syn case match
" FIXME: MIR doesn't actually match LLVM IR. Stop including it all as a
" fallback once enough is implemented.
" See the MIR LangRef: path_to_url
unlet b:current_syntax " Unlet so that the LLVM syntax will load
runtime! syntax/llvm.vim
unlet b:current_syntax
syn match mirType /\<[sp]\d\+\>/
" Opcodes. Matching instead of listing them because individual targets can add
" these. FIXME: Maybe use some more context to make this more accurate?
syn match mirStatement /\<[A-Z][A-Za-z0-9_]*\>/
syn match mirPReg /$[-a-zA-Z$._][-a-zA-Z$._0-9]*/
if version >= 508 || !exists("did_c_syn_inits")
if version < 508
let did_c_syn_inits = 1
command -nargs=+ HiLink hi link <args>
else
command -nargs=+ HiLink hi def link <args>
endif
HiLink mirType Type
HiLink mirStatement Statement
HiLink mirPReg Identifier
delcommand HiLink
endif
let b:current_syntax = "mir"
``` |
```java
package com.hotbitmapgg.bilibili.entity.recommend;
import java.util.List;
/**
* Created by hcc on 2016/9/24 19:25
* 100332338@qq.com
* <p>
* Banner
*/
public class RecommendBannerInfo {
private int code;
private List<DataBean> data;
public int getCode() {
return code;
}
public void setCode(int code) {
this.code = code;
}
public List<DataBean> getData() {
return data;
}
public void setData(List<DataBean> data) {
this.data = data;
}
public static class DataBean {
private String title;
private String value;
private String image;
private int type;
private int weight;
private String remark;
private String hash;
public String getTitle() {
return title;
}
public void setTitle(String title) {
this.title = title;
}
public String getValue() {
return value;
}
public void setValue(String value) {
this.value = value;
}
public String getImage() {
return image;
}
public void setImage(String image) {
this.image = image;
}
public int getType() {
return type;
}
public void setType(int type) {
this.type = type;
}
public int getWeight() {
return weight;
}
public void setWeight(int weight) {
this.weight = weight;
}
public String getRemark() {
return remark;
}
public void setRemark(String remark) {
this.remark = remark;
}
public String getHash() {
return hash;
}
public void setHash(String hash) {
this.hash = hash;
}
}
}
``` |
Thomas William Coke, 3rd Earl of Leicester (20 July 1848 – 19 November 1941), known as Viscount Coke until 1909, was a British peer and soldier.
Biography
Leicester was the eldest son of Thomas Coke, 2nd Earl of Leicester, by his first wife Juliana (née Whitbread).
He was a Colonel in the 2nd Battalion of the Scots Guards and served in Egypt in 1882, and at Suakin in 1885. Having retired from the regular army, he was appointed lieutenant-colonel in command of the Norfolk Artillery Militia on 21 February 1894. Following the outbreak of the Second Boer War in late 1899, the militia regiment was embodied in May 1900, and around 100 men were sent to South Africa under the command of Lord Coke. After peace was declared in May 1902, they left Cape Town on board the in late June, and arrived at Southampton the following month. For his service in the war, he was mentioned in despatches (including the final despatch by Lord Kitchener dated 23 June 1902), and was made a Companion of the Order of St Michael and St George (CMG) in the October 1902 South African Honours list. In January 1903 he was appointed an Aide-de-Camp for Militia to the King.
He was made a Knight Grand Cross of the Royal Victorian Order (GCVO) in 1908.
Lord Leicester held the position of Lord-Lieutenant of Norfolk from 1906 to 1929. He succeeded his father to the earldom and Holkham Hall in 1909.
Personal life
Lord Leicester married the Hon. Alice Emily White, daughter of Luke White, 2nd Baron Annaly, on 26 August 1879. They had five children:
Thomas William Coke, 4th Earl of Leicester born 9 July 1880, died 21 August 1949
Lieutenant Hon. Arthur George Coke, born 6 April 1882, killed in action on 21 May 1915 whilst serving with the Royal Naval Air Service. He is commemorated on the Helles Memorial at Gallipoli. Father of Anthony Coke, 6th Earl of Leicester.
Lady Marjory Alice Coke, born 1884, died 24 December 1946
Hon. Roger Coke, AFC, born 28 December 1886, died 14 Oct 1960; an officer in the Royal Air Force.
Lady Alexandra Marie Bridget Coke, born 1891, died 1984; married 1910 David Ogilvy, 12th Earl of Airlie
Alice Coke, Countess of Leicester, was later appointed Dame Commander of the Order of the British Empire. She died in 1936. Lord Leicester survived her by five years and died in November 1941, aged 93. He was succeeded in the earldom by his eldest son, Thomas.
References
Sources
Kidd, Charles; Williamson, David (eds.). Debrett's Peerage and Baronetage (1990 edition). New York: St Martin's Press, 1990.
1848 births
1941 deaths
Military personnel from Norfolk
Thomas Coke
British Army personnel of the Anglo-Egyptian War
British Army personnel of the Mahdist War
British Army personnel of the Second Boer War
3rd Earl of Leicester
Lord-Lieutenants of Norfolk
Scots Guards officers
Knights Grand Cross of the Royal Victorian Order
Companions of the Order of St Michael and St George |
- `Dictionary` view objects
- The fundamental `tuples`
- When `range` comes in handy
- `bytearray` objects
- `queue`s and threads |
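A quick, runnable tour of the features named above (all values are illustrative):

```python
from queue import Queue
from threading import Thread

# Dictionary view objects stay in sync with the dict they came from.
d = {"a": 1}
keys = d.keys()
d["b"] = 2
assert list(keys) == ["a", "b"]

# Tuples are immutable and hashable, so they work as dict keys.
point = (3, 4)
distances = {point: 5.0}

# range is a lazy integer sequence with O(1) membership tests.
evens = range(0, 10**9, 2)
assert 123456 in evens and 7 not in evens

# bytearray is the mutable counterpart to bytes.
buf = bytearray(b"hello")
buf[0] = ord("H")
assert bytes(buf) == b"Hello"

# queue.Queue gives a thread-safe hand-off between a worker and the main thread.
q = Queue()
worker = Thread(target=lambda: q.put(sum(evens[:3])))  # sum of 0, 2, 4
worker.start()
worker.join()
assert q.get() == 6
```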
```typescript
import { getMetadataArgsStorage } from "../../globals"
import { TableMetadataArgs } from "../../metadata-args/TableMetadataArgs"
import { DiscriminatorValueMetadataArgs } from "../../metadata-args/DiscriminatorValueMetadataArgs"
/**
* Special type of the table used in the single-table inherited tables.
*/
export function ChildEntity(discriminatorValue?: any): ClassDecorator {
return function (target: Function) {
// register a table metadata
getMetadataArgsStorage().tables.push({
target: target,
type: "entity-child",
} as TableMetadataArgs)
// register discriminator value if it was provided
if (typeof discriminatorValue !== "undefined") {
getMetadataArgsStorage().discriminatorValues.push({
target: target,
value: discriminatorValue,
} as DiscriminatorValueMetadataArgs)
}
}
}
``` |
```python
import argparse
import logging
import re
import time
from contextlib import nullcontext
from operator import eq, gt, lt
from typing import Any, Type
from unittest.mock import Mock, call, patch
import freezegun
import pytest
import requests.cookies
from streamlink.options import Options
from streamlink.plugin import (
HIGH_PRIORITY,
NORMAL_PRIORITY,
Plugin,
PluginArgument,
PluginArguments,
pluginargument,
pluginmatcher,
)
# noinspection PyProtectedMember
from streamlink.plugin.plugin import (
_COOKIE_KEYS, # noqa: PLC2701
_PLUGINARGUMENT_TYPE_REGISTRY, # noqa: PLC2701
Matcher,
parse_params,
stream_weight,
)
from streamlink.session import Streamlink
class FakePlugin(Plugin):
def _get_streams(self):
pass # pragma: no cover
class RenamedPlugin(FakePlugin):
__module__ = "foo.bar.baz"
class CustomConstructorOnePlugin(FakePlugin):
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
class CustomConstructorTwoPlugin(FakePlugin):
def __init__(self, session, url):
super().__init__(session, url)
class TestPlugin:
@pytest.mark.parametrize(("pluginclass", "module", "logger"), [
(Plugin, "plugin", "streamlink.plugin.plugin"),
(FakePlugin, "test_plugin", "tests.test_plugin"),
(RenamedPlugin, "baz", "foo.bar.baz"),
(CustomConstructorOnePlugin, "test_plugin", "tests.test_plugin"),
(CustomConstructorTwoPlugin, "test_plugin", "tests.test_plugin"),
])
def test_constructor(self, caplog: pytest.LogCaptureFixture, pluginclass: Type[Plugin], module: str, logger: str):
session = Mock()
with patch("streamlink.plugin.plugin.Cache") as mock_cache, \
patch.object(pluginclass, "load_cookies") as mock_load_cookies:
plugin = pluginclass(session, "path_to_url")
assert not caplog.records
assert plugin.session is session
assert plugin.url == "path_to_url"
assert plugin.module == module
assert isinstance(plugin.logger, logging.Logger)
assert plugin.logger.name == logger
assert mock_cache.call_args_list == [call(filename="plugin-cache.json", key_prefix=module)]
assert plugin.cache == mock_cache()
assert mock_load_cookies.call_args_list == [call()]
def test_constructor_options(self):
one = FakePlugin(Mock(), "path_to_url", Options({"key": "val"}))
two = FakePlugin(Mock(), "path_to_url")
assert one.get_option("key") == "val"
assert two.get_option("key") is None
one.set_option("key", "other")
assert one.get_option("key") == "other"
assert two.get_option("key") is None
class TestPluginMatcher:
# noinspection PyUnusedLocal
def test_decorator(self):
with pytest.raises(TypeError) as cm:
@pluginmatcher(re.compile(""))
class MyPlugin:
pass
assert str(cm.value) == "MyPlugin is not a Plugin"
# noinspection PyUnusedLocal
def test_named_duplicate(self):
with pytest.raises(ValueError, match=r"^A matcher named 'foo' has already been registered$"):
@pluginmatcher(re.compile("path_to_url"), name="foo")
@pluginmatcher(re.compile("path_to_url"), name="foo")
class MyPlugin(FakePlugin):
pass
def test_no_matchers(self):
class MyPlugin(FakePlugin):
pass
plugin = MyPlugin(Mock(), "path_to_url")
        assert plugin.url == "path_to_url"
        assert plugin.matchers is None
        assert plugin.matches == []
        assert plugin.matcher is None
        assert plugin.match is None

    def test_matchers(self):
        @pluginmatcher(re.compile("foo", re.VERBOSE))
        @pluginmatcher(re.compile("bar"), priority=HIGH_PRIORITY)
        @pluginmatcher(re.compile("baz"), priority=HIGH_PRIORITY, name="baz")
        class MyPlugin(FakePlugin):
            pass

        assert MyPlugin.matchers == [
            Matcher(re.compile("foo", re.VERBOSE), NORMAL_PRIORITY),
            Matcher(re.compile("bar"), HIGH_PRIORITY),
            Matcher(re.compile("baz"), HIGH_PRIORITY, "baz"),
        ]

    def test_url_setter(self):
        @pluginmatcher(re.compile("path_to_url"))
        @pluginmatcher(re.compile("path_to_url"))
        @pluginmatcher(re.compile("path_to_url"))
        class MyPlugin(FakePlugin):
            pass

        plugin = MyPlugin(Mock(), "path_to_url")
        assert plugin.url == "path_to_url"
        assert [m is not None for m in plugin.matches] == [True, False, False]
        assert plugin.matcher is plugin.matchers[0].pattern
        assert plugin.match.group(1) == "foo"

        plugin.url = "path_to_url"
        assert plugin.url == "path_to_url"
        assert [m is not None for m in plugin.matches] == [False, True, False]
        assert plugin.matcher is plugin.matchers[1].pattern
        assert plugin.match.group(1) == "bar"

        plugin.url = "path_to_url"
        assert plugin.url == "path_to_url"
        assert [m is not None for m in plugin.matches] == [False, False, True]
        assert plugin.matcher is plugin.matchers[2].pattern
        assert plugin.match.group(1) == "baz"

        plugin.url = "path_to_url"
        assert plugin.url == "path_to_url"
        assert [m is not None for m in plugin.matches] == [False, False, False]
        assert plugin.matcher is None
        assert plugin.match is None

    def test_named_matchers_and_matches(self):
        @pluginmatcher(re.compile("path_to_url"), name="foo")
        @pluginmatcher(re.compile("path_to_url"), name="bar")
        class MyPlugin(FakePlugin):
            pass

        plugin = MyPlugin(Mock(), "path_to_url")

        assert plugin.matchers["foo"] is plugin.matchers[0]
        assert plugin.matchers["bar"] is plugin.matchers[1]
        with pytest.raises(IndexError):
            plugin.matchers.__getitem__(2)
        with pytest.raises(KeyError):
            plugin.matchers.__getitem__("baz")

        assert plugin.matches["foo"] is plugin.matches[0]
        assert plugin.matches["bar"] is plugin.matches[1]
        assert plugin.matches["foo"] is not None
        assert plugin.matches["bar"] is None
        with pytest.raises(IndexError):
            plugin.matches.__getitem__(2)
        with pytest.raises(KeyError):
            plugin.matches.__getitem__("baz")

        plugin.url = "path_to_url"
        assert plugin.matches["foo"] is None
        assert plugin.matches["bar"] is not None

        plugin.url = "path_to_url"
        assert plugin.matches["foo"] is None
        assert plugin.matches["bar"] is None
class TestPluginArguments:
    @pluginargument("foo", dest="_foo", help="FOO")
    @pluginargument("bar", dest="_bar", help="BAR")
    @pluginargument("baz", dest="_baz", help="BAZ")
    class DecoratedPlugin(FakePlugin):
        pass

    class ClassAttrPlugin(FakePlugin):
        arguments = PluginArguments(
            PluginArgument("foo", dest="_foo", help="FOO"),
            PluginArgument("bar", dest="_bar", help="BAR"),
            PluginArgument("baz", dest="_baz", help="BAZ"),
        )

    def test_pluginargument_type_registry(self):
        assert _PLUGINARGUMENT_TYPE_REGISTRY
        assert all(callable(value) for value in _PLUGINARGUMENT_TYPE_REGISTRY.values())

    @pytest.mark.parametrize("pluginclass", [DecoratedPlugin, ClassAttrPlugin])
    def test_arguments(self, pluginclass):
        assert pluginclass.arguments is not None
        assert tuple(arg.name for arg in pluginclass.arguments) == ("foo", "bar", "baz"), "Argument name"
        assert tuple(arg.dest for arg in pluginclass.arguments) == ("_foo", "_bar", "_baz"), "Argument keyword"
        assert tuple(arg.options.get("help") for arg in pluginclass.arguments) == ("FOO", "BAR", "BAZ"), "argparse keyword"

    def test_mixed(self):
        @pluginargument("qux")
        class MixedPlugin(self.ClassAttrPlugin):
            pass

        assert tuple(arg.name for arg in MixedPlugin.arguments) == ("qux", "foo", "bar", "baz")

    @pytest.mark.parametrize(("options", "args", "expected", "raises"), [
        pytest.param(
            {"type": "int"},
            ["--myplugin-foo", "123"],
            123,
            nullcontext(),
            id="int",
        ),
        pytest.param(
            {"type": "float"},
            ["--myplugin-foo", "123.456"],
            123.456,
            nullcontext(),
            id="float",
        ),
        pytest.param(
            {"type": "bool"},
            ["--myplugin-foo", "yes"],
            True,
            nullcontext(),
            id="bool",
        ),
        pytest.param(
            {"type": "keyvalue"},
            ["--myplugin-foo", "key=value"],
            ("key", "value"),
            nullcontext(),
            id="keyvalue",
        ),
        pytest.param(
            {"type": "comma_list_filter", "type_args": (["one", "two", "four"], )},
            ["--myplugin-foo", "one,two,three,four"],
            ["one", "two", "four"],
            nullcontext(),
            id="comma_list_filter - args",
        ),
        pytest.param(
            {"type": "comma_list_filter", "type_kwargs": {"acceptable": ["one", "two", "four"]}},
            ["--myplugin-foo", "one,two,three,four"],
            ["one", "two", "four"],
            nullcontext(),
            id="comma_list_filter - kwargs",
        ),
        pytest.param(
            {"type": "hours_minutes_seconds"},
            ["--myplugin-foo", "1h2m3s"],
            3723,
            nullcontext(),
            id="hours_minutes_seconds",
        ),
        pytest.param(
            {"type": "UNKNOWN"},
            None,
            None,
            pytest.raises(TypeError),
            id="UNKNOWN",
        ),
    ])
    def test_type_argument_map(self, options: dict, args: list, expected: Any, raises: nullcontext):
        class MyPlugin(FakePlugin):
            pass

        with raises:
            pluginargument("foo", **options)(MyPlugin)
            assert MyPlugin.arguments is not None
            pluginarg = MyPlugin.arguments.get("foo")
            assert pluginarg

            parser = argparse.ArgumentParser()
            parser.add_argument(pluginarg.argument_name("myplugin"), **pluginarg.options)
            namespace = parser.parse_args(args)
            assert namespace.myplugin_foo == expected

    def test_decorator_typeerror(self):
        with patch("builtins.repr", Mock(side_effect=lambda obj: obj.__name__)):
            with pytest.raises(TypeError) as cm:
                # noinspection PyUnusedLocal
                @pluginargument("foo")
                class Foo:
                    pass
        assert str(cm.value) == "Foo is not a Plugin"

    def test_empty(self):
        assert Plugin.arguments is None
@pytest.mark.parametrize("attr", ["id", "author", "category", "title"])
def test_plugin_metadata(attr):
plugin = FakePlugin(Mock(), "path_to_url")
getter = getattr(plugin, f"get_{attr}")
assert callable(getter)
assert getattr(plugin, attr) is None
assert getter() is None
setattr(plugin, attr, " foo bar ")
assert getter() == "foo bar"
class Foo:
def __str__(self):
return " baz qux "
setattr(plugin, attr, Foo())
assert getter() == "baz qux"
class TestCookies:
    @staticmethod
    def create_cookie_dict(name, value, expires=None):
        return dict(
            version=0,
            name=name,
            value=value,
            port=None,
            domain="test.se",
            path="/",
            secure=False,
            expires=expires,
            discard=True,
            comment=None,
            comment_url=None,
            rest={"HttpOnly": None},
            rfc2109=False,
        )

    # TODO: py39 support end: remove explicit dummy context binding of static method
    _create_cookie_dict = create_cookie_dict.__get__(object)

    @pytest.fixture()
    def pluginclass(self):
        class MyPlugin(FakePlugin):
            __module__ = "myplugin"

        return MyPlugin

    @pytest.fixture()
    def plugincache(self, request):
        with patch("streamlink.plugin.plugin.Cache") as mock_cache:
            cache = mock_cache("plugin-cache.json", "myplugin")
            cache.get_all.return_value = request.param
            yield cache

    @pytest.fixture()
    def logger(self, pluginclass: Type[Plugin]):
        with patch("streamlink.plugin.plugin.logging") as mock_logging:
            yield mock_logging.getLogger(pluginclass.__module__)

    @pytest.fixture()
    def plugin(self, pluginclass: Type[Plugin], session: Streamlink, plugincache: Mock, logger: Mock):
        plugin = pluginclass(session, "path_to_url")
        assert plugin.cache is plugincache
        assert plugin.logger is logger

        return plugin

    @staticmethod
    def _cookie_to_dict(cookie):
        r = {name: getattr(cookie, name, None) for name in _COOKIE_KEYS}
        r["rest"] = getattr(cookie, "rest", getattr(cookie, "_rest", None))

        return r

    def _cookies_to_list(self, cookies):
        return [self._cookie_to_dict(cookie) for cookie in cookies]

    @pytest.mark.parametrize(
        "plugincache",
        [{
            "__cookie:test-name1:test.se:80:/": _create_cookie_dict("test-name1", "test-value1"),
            "__cookie:test-name2:test.se:80:/": _create_cookie_dict("test-name2", "test-value2"),
            "unrelated": "data",
        }],
        indirect=True,
    )
    def test_load(self, session: Streamlink, plugin: Plugin, plugincache: Mock, logger: Mock):
        assert self._cookies_to_list(session.http.cookies) == self._cookies_to_list([
            requests.cookies.create_cookie("test-name1", "test-value1", domain="test.se"),
            requests.cookies.create_cookie("test-name2", "test-value2", domain="test.se"),
        ])
        assert logger.debug.call_args_list == [call("Restored cookies: test-name1, test-name2")]

    @pytest.mark.parametrize("plugincache", [{}], indirect=True)
    def test_save(self, session: Streamlink, plugin: Plugin, plugincache: Mock, logger: Mock):
        cookie1 = requests.cookies.create_cookie("test-name1", "test-value1", domain="test.se")
        cookie2 = requests.cookies.create_cookie("test-name2", "test-value2", domain="test.se")
        session.http.cookies.set_cookie(cookie1)
        session.http.cookies.set_cookie(cookie2)

        plugin.save_cookies(lambda cookie: cookie.name == "test-name1", default_expires=3600)
        assert plugincache.set.call_args_list == [call(
            "__cookie:test-name1:test.se:80:/",
            self.create_cookie_dict("test-name1", "test-value1", None),
            3600,
        )]
        assert logger.debug.call_args_list == [call("Saved cookies: test-name1")]

    @freezegun.freeze_time("1970-01-01T00:00:00Z")
    @pytest.mark.parametrize("plugincache", [{}], indirect=True)
    def test_save_expires(self, session: Streamlink, plugin: Plugin, plugincache: Mock):
        cookie = requests.cookies.create_cookie(
            "test-name",
            "test-value",
            domain="test.se",
            expires=time.time() + 3600,
            rest={"HttpOnly": None},
        )
        session.http.cookies.set_cookie(cookie)

        plugin.save_cookies(default_expires=60)
        assert plugincache.set.call_args_list == [call(
            "__cookie:test-name:test.se:80:/",
            self.create_cookie_dict("test-name", "test-value", 3600),
            3600,
        )]

    @pytest.mark.parametrize(
        "plugincache",
        [{
            "__cookie:test-name1:test.se:80:/": _create_cookie_dict("test-name1", "test-value1", None),
            "__cookie:test-name2:test.se:80:/": _create_cookie_dict("test-name2", "test-value2", None),
            "unrelated": "data",
        }],
        indirect=True,
    )
    def test_clear(self, session: Streamlink, plugin: Plugin, plugincache: Mock):
        assert tuple(session.http.cookies.keys()) == ("test-name1", "test-name2")
        plugin.clear_cookies()
        assert call("__cookie:test-name1:test.se:80:/", None, 0) in plugincache.set.call_args_list
        assert call("__cookie:test-name2:test.se:80:/", None, 0) in plugincache.set.call_args_list
        assert len(session.http.cookies.keys()) == 0

    @pytest.mark.parametrize(
        "plugincache",
        [{
            "__cookie:test-name1:test.se:80:/": _create_cookie_dict("test-name1", "test-value1", None),
            "__cookie:test-name2:test.se:80:/": _create_cookie_dict("test-name2", "test-value2", None),
            "unrelated": "data",
        }],
        indirect=True,
    )
    def test_clear_filter(self, session: Streamlink, plugin: Plugin, plugincache: Mock):
        assert tuple(session.http.cookies.keys()) == ("test-name1", "test-name2")
        plugin.clear_cookies(lambda cookie: cookie.name == "test-name2")
        assert call("__cookie:test-name1:test.se:80:/", None, 0) not in plugincache.set.call_args_list
        assert call("__cookie:test-name2:test.se:80:/", None, 0) in plugincache.set.call_args_list
        assert tuple(session.http.cookies.keys()) == ("test-name1",)
@pytest.mark.parametrize(("params", "expected"), [
(
None,
{},
),
(
"foo=bar",
dict(foo="bar"),
),
(
"verify=False",
dict(verify=False),
),
(
"timeout=123.45",
dict(timeout=123.45),
),
(
"verify=False params={'key': 'a value'}",
dict(verify=False, params=dict(key="a value")),
),
(
"\"conn=['B:1', 'S:authMe', 'O:1', 'NN:code:1.23', 'NS:flag:ok', 'O:0']",
dict(conn=["B:1", "S:authMe", "O:1", "NN:code:1.23", "NS:flag:ok", "O:0"]),
),
])
def test_parse_params(params, expected):
assert parse_params(params) == expected
@pytest.mark.parametrize(("weight", "expected"), [
("720p", (720, "pixels")),
("720p+", (721, "pixels")),
("720p60", (780, "pixels")),
])
def test_stream_weight_value(weight, expected):
assert stream_weight(weight) == expected
@pytest.mark.parametrize(("weight_a", "operator", "weight_b"), [
("720p+", gt, "720p"),
("720p_3000k", gt, "720p_2500k"),
("720p60_3000k", gt, "720p_3000k"),
("3000k", gt, "2500k"),
("720p", eq, "720p"),
("720p_3000k", lt, "720p+_3000k"),
# with audio
("720p+a256k", gt, "720p+a128k"),
("720p+a256k", gt, "360p+a256k"),
("720p+a128k", gt, "360p+a256k"),
])
def test_stream_weight(weight_a, weight_b, operator):
assert operator(stream_weight(weight_a), stream_weight(weight_b))
``` |
Upper Town is an unincorporated community in Mono County, California. It is located about northeast of Bridgeport, at an elevation of 8061 feet (2457 m).
Upper Town was part of a string of towns established by Freemasons, others being Middle Town and Lower Town.
References
Unincorporated communities in California
Unincorporated communities in Mono County, California |
Maltese may refer to:
Someone or something of, from, or related to Malta
Maltese alphabet
Maltese cuisine
Maltese culture
Maltese language, the Semitic language spoken by Maltese people
Maltese people, people from Malta or of Maltese descent
Animals
Maltese dog
Maltese goat
Maltese cat
Maltese tiger
Other uses
Maltese cross
Maltese (surname), a surname (including a list of people with the name)
See also
The Maltese Falcon (disambiguation)
Language and nationality disambiguation pages |
From Langley Park to Memphis is the third studio album by English pop band Prefab Sprout. It was released by Kitchenware Records on 14 March 1988. It peaked at number five on the UK Albums Chart, the highest position for any studio album released by the band. Recorded in Newcastle, London and Los Angeles, it has a more polished and commercial sound than their earlier releases, and features several guest stars including Stevie Wonder and Pete Townshend. The album's simpler songs, big productions and straightforward cover photo reflect frontman Paddy McAloon's wish for it to be a more universal work than the band's more cerebral earlier records.
The album received mixed reviews upon release with several criticising the elaborate production style, while McAloon's songwriting received praise. The album's commercial performance was bolstered by the success of its single "The King of Rock 'n' Roll", which became the band's only top 10 hit on the UK Singles Chart when it peaked at No. 7. The four other singles released from the album, "Cars and Girls", "Hey Manhattan!", "Nightingales" and "The Golden Calf", failed to make the top 40.
Background and recording
After the critical and commercial success of Prefab Sprout's Thomas Dolby-produced second album, 1985's Steve McQueen, Paddy McAloon felt under pressure to deliver a worthy follow-up. McAloon resolved to quickly record and release a new album using limited production values. Titled Protest Songs, the album was recorded over two weeks in Newcastle and intended for a limited release in late 1985. However, "When Love Breaks Down", a single from Steve McQueen, became a transatlantic hit in October 1985, and Protest Songs was put on hold by CBS so as not to confuse new fans and stunt sales of Steve McQueen.
Starting work on a new follow-up to Steve McQueen in 1987, the band considered rerecording songs from Protest Songs, but decided to leave the album untouched and start anew. From Langley Park to Memphis was recorded sporadically over a year in Newcastle, London and Los Angeles. Steve McQueen producer Thomas Dolby was unable to commit to producing the entire album due to his work on the soundtrack for George Lucas's Howard the Duck, ultimately a critical and commercial flop. Instead, Dolby produced the four tracks he liked the most out of 16 demos sent to him by McAloon. McAloon produced most of the remaining tracks in collaboration with Jon Kelly, while Andy Richards took Kelly's place for "Hey Manhattan!" and "The Golden Calf" was produced by McAloon alone. McAloon did not want the album's sound to be as uniform as Steve McQueen's, and initially planned to use 10 different producers. This was ultimately deemed a logistical impossibility.
The album features guest appearances from Pete Townshend, Stevie Wonder and the Andraé Crouch Singers – McAloon felt the latter two's contributions proved the band's music was not exclusively British.
Composition
Musical and lyrical style
In contrast to Prefab Sprout's previous work, most of the album's songs were written on keyboard and the album's sound has been described as "sonically soft". McAloon's home recording and composing setup at the time included a Roland JX-3P, a Roland JX-10, a Yamaha DX7, an Ensoniq Mirage and a Casiotone. McAloon was most comfortable with the JX-3P for composing while a Fostex B16 was used for recording demos. He aimed to write more accessible songs than those on the band's earlier records, stating "I've realised that a good simple song is better than a half-successful complicated one." McAloon also sought to expand the band's sound to incorporate his favourite elements of popular music, including gospel music and Broadway, and to reach an audience "seduced by the overall glamour and romanticism". According to Sam Sodomsky of Pitchfork, From Langley Park to Memphis includes an eclectic mix of styles including alternative rock ("The Golden Calf"), standards ("Nightingales") and Broadway-style singalong ("Hey Manhattan!"). Several songs feature American themes, reflected in the album's title. McAloon explained in a 1988 interview that he often drew inspiration from America for his songs because "America remains an inexhaustible source of myths and the extreme."
Songs
Of the album's ten tracks, Thomas Dolby produced "The King of Rock 'n' Roll", "I Remember That", "Knock on Wood" and "The Venus of the Soup Kitchen". "The King of Rock 'n' Roll" was written in 1985. The lyrics are written from the perspective of a washed-up singer who had a one-hit wonder in the 1950s with a novelty song featuring the chorus "Hot dog, jumping frog, Albuquerque". McAloon was aware of the song's commercial potential early on, and felt it would surprise fans used to the band's earlier, more cerebral material. Musically, "I Remember That" is, according to Nils Johansson of NSD, a gospel ballad. McAloon considered the song's nostalgic mood a lighter lyrical theme than that of a love song, with the title phrase being "close to romanticism without actually being sloppy". He tried to sing the song with a "lightness of feeling". In a 1997 interview, McAloon named "I Remember That" "the best song I've ever written". "Knock on Wood" has been described by David Stubbs of Melody Maker as a "song about breakdown, how the man who jilts will himself be jilted, couched in a beautifully adhesive reggae lilt." "The Venus of the Soup Kitchen" closes the album. McAloon wanted the song's melody to be far-reaching and resonant, with the chorus expressing "the emotional participation of everyone listening to it". He described the song's meaning in a 1988 interview: "Venus travels along the road from Langley Park to Memphis. I have imagined it full of troubled people, people who need a Venus who can cook soup for them." The song features the Andraé Crouch Singers, who recorded their contribution in Stevie Wonder's studio in Los Angeles.
Jon Kelly produced "Cars and Girls", "Enchanted", "Nightingales" and "Nancy (Let Your Hair Down for Me)". "Cars and Girls" was written in 1985, and played by the band during live appearances that year. Lyrically, the song is a comment on Bruce Springsteen's use of romantic metaphors in his songs. McAloon has denied that the song indicates a personal distaste for Springsteen, telling NME "I think a lot of his audience get into him on a patriotic level that he doesn't intend. They misinterpret him, their enjoyment of him is inaccurate, all very imperialist American. I wanted to write a song about someone who was thick white trash, listening to Springsteen, and saying 'But our lives aren't like that'." Paddy McAloon has described "Enchanted" as being about "finding something to be excited about, year after year". Thomas Dolby suggested Prince should produce the track, but the album's sound engineer David Leonard failed to find Prince at Sunset Sound Recorders to approach him. McAloon sampled the opening bass run of Glen Campbell's recording of "Wichita Lineman" for the song's bassline.
McAloon wrote "Nightingales" with Barbra Streisand – whose The Broadway Album he was engrossed by – in mind. He considered it as "the purest song" the band had recorded since "When Love Breaks Down". McAloon originally envisioned the song featuring a horn solo, but ultimately composed a complex harmonica solo and wrote a letter to Stevie Wonder asking for him to play it. Wonder hadn't heard of Prefab Sprout but nevertheless obliged, adding his own melodic lines to the song. McAloon would later describe his contribution as "so breathtakingly good and precise, even though he said himself it was quite complicated". McAloon has described "Nancy (Let Your Hair Down for Me)" as "a modern love story". The song is about a married couple who work together, with the wife being the husband's boss.
Andy Richards produced "Hey Manhattan!", a song McAloon wrote on piano. McAloon originally wanted an American, Isaac Hayes, to sing it. The proposed collaboration was quashed when Hayes' manager wanted more than was offered. The song is about an enthusiastic teenager who arrives in a big city, with the theme of dreams and ambitions. Pete Townshend provided acoustic guitar for the song during the mixing stage at his studio. McAloon was nervous about the song's production during recording, having not worked with Andy Richards before, but ultimately approved of his work. Nevertheless, he'd describe "Hey Manhattan!" as "the one song I'm dissatisfied with the way we realised it. It's pretty but it's a failure."
"The Golden Calf" was self-produced by Paddy McAloon. It is among the earliest-written songs Prefab Sprout have released, having been composed in 1977 when the band was a guitar-based trio who made what McAloon would describe as "heavy metal meeting disco". The Langley Park version felt "like doing a cover version" for McAloon due to the lapse of time, and he used a less breathy singing voice than usual on the track, something he felt Thomas Dolby would not have allowed and considered more in line with his vocals from Swoon. "The Golden Calf" has been described by Andreas Hub of Fachblatt as "a real rocker" and has garnered comparisons to the work of Pete Townshend, Marc Bolan and Del Amitri.
Release
From Langley Park to Memphis was released by Kitchenware Records on 14 March 1988. The album's title comes from a line from "The Venus of the Soup Kitchen" - "Maybe it hurts your brothers too, from Langley Park to Memphis" - a lyric about universal emotions. Langley Park is a village in County Durham near where the band originated. Memphis was chosen as it was where Elvis Presley began his career. The title has been construed as a reference to Presley's album From Memphis to Vegas / From Vegas to Memphis. The album cover, designed by Nick Knight, is a straightforward image of the four band members, intended to reflect how the album is clearer and more direct than its predecessors. The album was Prefab Sprout's first to chart in the top 20, entering the UK Albums Chart at number five and remaining in the chart for 23 weeks. It remains the band's highest-charting studio album. It was certified gold by the British Phonographic Industry in April 1988. By 1997, From Langley Park to Memphis was estimated to have sold 330,000 units in the UK.
"Cars and Girls" was released as the album's lead single but failed to reach the top 40 of the UK Singles Chart, reaching a peak of number 44 over five weeks on the chart. Speaking in 1992, McAloon described himself as "shocked and stunned" at the song not being a hit, commenting "I woke up then and I’ve never had such high expectations since." In August 1988, the band were reported to have persuaded CBS to rerelease the single but this ultimately didn't happen. The second single "The King of Rock 'n' Roll" remains Prefab Sprout's greatest success in their native UK. Their only top ten hit, the song peaked at No. 7 and spent 11 weeks on the chart. The band promoted the single with mimed performances of the song on Top Of The Pops and Wogan. McAloon would later point out the irony of a song about a one-hit wonder being his only top ten single. "Hey Manhattan!" was issued as the album's third single, reaching number 72, while its fourth "Nightingales" charted at number 78. Having been offered to American album-oriented rock radio stations by Epic in advance promotion of the album, "The Golden Calf" was the fifth and final single. It was promoted by the band with performances on the children's television programmes Going Live! and Get Fresh. It charted at number 82 in the UK. "I Remember That" was released as a single in 1993 to promote the compilation album A Life of Surprises: The Best of Prefab Sprout but failed to chart.
Critical reception
From Langley Park to Memphis received mixed reviews. Rolling Stone's Peter Wilkinson described the album as "overreaching", elaborating "McAloon tries leavening disjointed talk with instrumental gimmickry. Songs built around McAloon's guitar are lost in a swirl of strings and the noodlings of no less than five engineers and four producers." Dave Rimmer of Q considered it "probably their best album yet" but found "The King of Rock 'n' Roll" "a mite irritating". He felt that "the only true duff part is the overblown imagery of "The Golden Calf"." NME's Len Brown was not enamoured with the album's production style, calling it "sickly" and "cluttered". He considered the album "a largely bland affair", but praised "Cars and Girls" and "Nancy (Let Your Hair Down for Me)". Creem's Kurt B. Riley was critical of the album, feeling that the songwriting was "done a great disservice by ill-fitting arrangements". Melody Maker's David Stubbs felt it was "less strong" than Steve McQueen but "more ambitious". Vogue's Barney Hoskyns commented "at least seven of its 10 songs are more accessible, more ravishingly beautiful than anything McAloon has written." Both Record Mirror and Hot Press ranked the album number 5 in their "Albums of the Year" lists. Additionally, the album was included in "Albums of the Year" lists in Q, The Village Voice, Musikexpress, Spex and Rockdelux. Dave DiMartino of Billboard ranked the album his fifth favourite of the year, commenting "Paddy McAloon has seen the future of rock and roll - and has returned bearing the names of Jimmy Webb, Cole Porter and absolutely no songs about cars 'n' girls." In 1991, Melody Maker's Paul Lester described From Langley Park to Memphis as "a hyper-modern dazzling white pop LP that ranks alongside Dare, The Lexicon of Love and Colour by Numbers."
Among retrospective reviews, Jason Ankeny of AllMusic gave the "ambitious" album 4 stars out of 5, calling it "Prefab Sprout's spiritual journey into the heart of American culture", though he felt it paled in comparison to Steve McQueen. Writing in Italy's Ciao Magazine in 1990, Paolo Battigelli described it as a "not entirely convincing record" but added that "Cars and Girls" confirmed McAloon as "a composer with a rare talent, albeit one hiding himself behind allegories and tortuous references". Writing for Pitchfork upon the album's reissue in 2019, Sam Sodomsky considered the album as "catchy and complex" as its best-known songs "The King of Rock 'n' Roll" and "Cars and Girls" and described the music as "colourful and hopeful and alive - everything seems to sparkle, right down to the glossy band photo on the album cover". "Nancy (Let Your Hair Down for Me)" was among the ten tracks listed in NME's "Alternative Best of Prefab Sprout" in 1992.
Aftermath and legacy
The album's commercial success caused an uncomfortable level of recognition for Paddy McAloon, who would later recollect: "I was asked for autographs, girls wanted to put their hands in my hair, touch me... the glamorous aspect of our music has always been for me a way of showing how we as individuals are the opposite of this glittering world." Despite demand from fans and CBS, Prefab Sprout did not tour the album, as McAloon did not want to sacrifice what he described as "the best time of my life for writing", stating "I know that if I go on the road I'll just end up writing in the same way as everyone else." In interviews surrounding the album's release, McAloon alluded to two new projects he was working on: a Christmas album called Total Snow and a musical about the fictional masked vigilante Zorro called Zorro the Fox. As of 2022, neither of these projects has materialised. Regarding "The King of Rock 'n' Roll", McAloon has described himself as "reconciled to being remembered for that song" and "aware that it's a bit like being known for "Yellow Submarine" rather than "Hey Jude"." A remastered edition of the album, overseen by Paddy and Martin McAloon, was issued by Sony Music on 27 September 2019.
Track listing
Personnel
Credits adapted from liner notes.
Prefab Sprout
Neil Conti
Martin McAloon
Paddy McAloon
Wendy Smith
Additional musicians
Thomas Dolby – keyboards (1, 3, 7, 10)
Gary Moberley – keyboards (2, 4, 5)
Paul "Wix" Wickens – keyboards (2, 4, 5, 9)
Andy Richards – keyboards (6)
Luís Jardim – percussion (2, 4, 5, 6)
Lenny Castro – percussion (10)
Stevie Wonder – harmonica (5)
Pete Townshend – acoustic guitar (6)
The Andraé Crouch Singers – vocals (3, 10)
Gavin Wright – strings lead (5, 6, 9)
Robin Smith – string arrangement, conduct (5, 9)
John Altman – string arrangement, conduct (6)
Technical personnel
Thomas Dolby – production (1, 3, 7, 10)
Jon Kelly – production (2, 4, 5, 9)
Paddy McAloon – production (2, 4, 5, 6, 8, 9)
Andy Richards – production (6)
David Leonard – mixing (1, 3, 7, 10)
Richard Moakes – mixing (2, 4, 9)
Mike Shipley – mixing (5)
Tony Philips – mixing (6)
Michael H. Brauer – mixing (8)
Tim Young – mastering
Stephen Male – design
Nick Knight – photography
Charts
Weekly
Year-end
Certifications and sales
Notes
References
External links
Die 80 größten Alben der 80er: Prefab Sprout: "From Langley Park To Memphis"
1988 albums
Prefab Sprout albums
Albums produced by Jon Kelly
Albums produced by Thomas Dolby
Kitchenware Records albums
Langley Park, County Durham |
```c
/*
 * The Regents of the University of California. All rights reserved.
 *
 * Redistribution and use in source and binary forms, with or without
 * modification, are permitted provided that: (1) source code distributions
 * retain the above copyright notice and this paragraph in its entirety, (2)
 * distributions including binary code include the above copyright notice and
 * this paragraph in its entirety in the documentation or other materials
 * provided with the distribution, and (3) all advertising materials mentioning
 * features or use of this software display the following acknowledgement:
 * ``This product includes software developed by the University of California,
 * Lawrence Berkeley Laboratory and its contributors.'' Neither the name of
 * the University nor the names of its contributors may be used to endorse
 * or promote products derived from this software without specific prior
 * written permission.
 * THIS SOFTWARE IS PROVIDED ``AS IS'' AND WITHOUT ANY EXPRESS OR IMPLIED
 * WARRANTIES, INCLUDING, WITHOUT LIMITATION, THE IMPLIED WARRANTIES OF
 * MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE.
 */
#ifdef HAVE_CONFIG_H
#include <config.h>
#endif
#include <pcap-types.h>
#include <ctype.h>
#include <memory.h>
#include <stdio.h>
#include <string.h>
#include "pcap-int.h"
#include <pcap/namedb.h>
#ifdef HAVE_OS_PROTO_H
#include "os-proto.h"
#endif
static inline int skip_space(FILE *);
static inline int skip_line(FILE *);

/* Hex digit to integer. */
static inline u_char
xdtoi(u_char c)
{
    if (isdigit(c))
        return (u_char)(c - '0');
    else if (islower(c))
        return (u_char)(c - 'a' + 10);
    else
        return (u_char)(c - 'A' + 10);
}

static inline int
skip_space(FILE *f)
{
    int c;

    do {
        c = getc(f);
    } while (isspace(c) && c != '\n');

    return c;
}

static inline int
skip_line(FILE *f)
{
    int c;

    do
        c = getc(f);
    while (c != '\n' && c != EOF);

    return c;
}

struct pcap_etherent *
pcap_next_etherent(FILE *fp)
{
    register int c, i;
    u_char d;
    char *bp;
    size_t namesize;
    static struct pcap_etherent e;

    memset((char *)&e, 0, sizeof(e));
    for (;;) {
        /* Find addr */
        c = skip_space(fp);
        if (c == EOF)
            return (NULL);
        if (c == '\n')
            continue;

        /* If this is a comment, or first thing on line
           cannot be Ethernet address, skip the line. */
        if (!isxdigit(c)) {
            c = skip_line(fp);
            if (c == EOF)
                return (NULL);
            continue;
        }

        /* must be the start of an address */
        for (i = 0; i < 6; i += 1) {
            d = xdtoi((u_char)c);
            c = getc(fp);
            if (c == EOF)
                return (NULL);
            if (isxdigit(c)) {
                d <<= 4;
                d |= xdtoi((u_char)c);
                c = getc(fp);
                if (c == EOF)
                    return (NULL);
            }
            e.addr[i] = d;
            if (c != ':')
                break;
            c = getc(fp);
            if (c == EOF)
                return (NULL);
        }

        /* Must be whitespace */
        if (!isspace(c)) {
            c = skip_line(fp);
            if (c == EOF)
                return (NULL);
            continue;
        }
        c = skip_space(fp);
        if (c == EOF)
            return (NULL);

        /* hit end of line... */
        if (c == '\n')
            continue;

        if (c == '#') {
            c = skip_line(fp);
            if (c == EOF)
                return (NULL);
            continue;
        }

        /* pick up name */
        bp = e.name;
        /* Use 'namesize' to prevent buffer overflow. */
        namesize = sizeof(e.name) - 1;
        do {
            *bp++ = (u_char)c;
            c = getc(fp);
            if (c == EOF)
                return (NULL);
        } while (!isspace(c) && --namesize != 0);
        *bp = '\0';

        /* Eat trailing junk */
        if (c != '\n')
            (void)skip_line(fp);

        return &e;
    }
}
``` |
Hamana may refer to:
Lake Hamana, Shizuoka Prefecture, Japan
Hamana District, Shizuoka, Japan
Hamana High School, Hamamatsu, Japan
, several ships
Hamana (leafhopper), a genus in subfamily Iassinae
People with the surname
, Japanese animation director
See also
Gberedou/Hamana, a region in Guinea
Tsugaru-Hamana Station, Japan |
Tor Arne Lau Henriksen (February 22, 1974 – July 23, 2007) was a Norwegian officer who was killed in action in Afghanistan.
Lieutenant Lau Henriksen was the first Norwegian soldier to be awarded the military cross. He was named "Name of the Year" for 2007 by the newspaper Verdens Gang.
He joined Hærens Jegerkommando before he went to Afghanistan. During a mission there, Lau Henriksen was killed by members of the Taliban in Lowgar Province. He and other Norwegian soldiers were on a reconnaissance mission together with soldiers from the Afghan National Army, when they were attacked by Taliban members dressed as civilians. Lau Henriksen was shot in the chest and became the first soldier from Hærens Jegerkommando to be killed in action in Afghanistan.
References
Norwegian Army personnel
Norwegian military personnel killed in the War in Afghanistan (2001–2021)
2007 deaths
1974 births |
```java
/*
*
* See the CONTRIBUTORS.txt file in the distribution for a
* full listing of individual contributors.
*
 * This program is free software: you can redistribute it and/or modify
 * it under the terms of the GNU Affero General Public License as
 * published by the Free Software Foundation, either version 3 of the
 * License, or (at your option) any later version.
 *
 * This program is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 * GNU Affero General Public License for more details.
 *
 * You should have received a copy of the GNU Affero General Public License
 * along with this program. If not, see <path_to_url
*/
package org.openremote.manager.rules;
import groovy.lang.Binding;
import groovy.lang.GroovyShell;
import groovy.lang.Script;
import org.codehaus.groovy.control.CompilerConfiguration;
import org.jeasy.rules.api.Action;
import org.jeasy.rules.api.Condition;
import org.jeasy.rules.api.Rule;
import org.jeasy.rules.api.Rules;
import org.jeasy.rules.core.RuleBuilder;
import org.kohsuke.groovy.sandbox.GroovyValueFilter;
import org.kohsuke.groovy.sandbox.SandboxTransformer;
import org.openjdk.nashorn.api.scripting.ScriptObjectMirror;
import org.openremote.container.timer.TimerService;
import org.openremote.manager.asset.AssetStorageService;
import org.openremote.model.calendar.CalendarEvent;
import org.openremote.model.rules.*;
import org.openremote.model.rules.flow.NodeCollection;
import org.openremote.model.syslog.SyslogCategory;
import org.openremote.model.util.Pair;
import org.openremote.model.util.TextUtil;
import org.openremote.model.util.ValueUtil;
import javax.script.*;
import java.util.*;
import java.util.concurrent.Future;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.ScheduledFuture;
import java.util.concurrent.TimeUnit;
import java.util.logging.Level;
import java.util.logging.Logger;
import static org.openremote.model.rules.RulesetStatus.*;
public class RulesetDeployment {
// TODO Finish groovy sandbox
static class GroovyDenyAllFilter extends GroovyValueFilter {
@Override
public Object filterReceiver(Object receiver) {
throw new SecurityException("Not allowed: " + receiver);
}
}
public static final int DEFAULT_RULE_PRIORITY = 1000;
// Share one JS script engine manager, it's thread-safe
static final protected ScriptEngineManager scriptEngineManager;
static final protected GroovyShell groovyShell;
static {
scriptEngineManager = new ScriptEngineManager();
/* TODO Sharing a static GroovyShell doesn't work, redeploying a ruleset which defines classes (e.g. Flight) is broken:
java.lang.RuntimeException: Error evaluating condition of rule '-Update flight facts when estimated landing time of flight asset is updated':
No signature of method: org.openremote.manager.setup.database.Script1$_run_closure2$_closure14$_closure17.doCall() is applicable for argument types: (org.openremote.manager.setup.database.Flight) values: [...]
Possible solutions: doCall(org.openremote.manager.setup.database.Flight), findAll(), findAll(), isCase(java.lang.Object), isCase(java.lang.Object)
The following classes appear as argument class and as parameter class, but are defined by different class loader:
org.openremote.manager.setup.database.Flight (defined by 'groovy.lang.GroovyClassLoader$InnerLoader@2cc34cd5' and 'groovy.lang.GroovyClassLoader$InnerLoader@1af957bc')
If one of the method suggestions matches the method you wanted to call,
then check your class loader setup.
*/
groovyShell = new GroovyShell(
new CompilerConfiguration().addCompilationCustomizers(new SandboxTransformer())
);
}
protected static final Pair<Long, Long> ALWAYS_ACTIVE = new Pair<>(0L, Long.MAX_VALUE);
protected static final Pair<Long, Long> EXPIRED = new Pair<>(0L, 0L);
final protected Ruleset ruleset;
final protected Rules rules = new Rules();
final protected AssetStorageService assetStorageService;
final protected TimerService timerService;
final protected ScheduledExecutorService executorService;
final protected Assets assetsFacade;
final protected Users usersFacade;
final protected Notifications notificationsFacade;
final protected Webhooks webhooksFacade;
final protected HistoricDatapoints historicDatapointsFacade;
final protected PredictedDatapoints predictedDatapointsFacade;
final protected List<ScheduledFuture<?>> scheduledRuleActions = Collections.synchronizedList(new ArrayList<>());
protected final Logger LOG;
protected boolean running;
protected RulesetStatus status = RulesetStatus.READY;
protected Throwable error;
protected JsonRulesBuilder jsonRulesBuilder;
protected FlowRulesBuilder flowRulesBuilder;
protected CalendarEvent validity;
protected Pair<Long, Long> nextValidity;
public RulesetDeployment(Ruleset ruleset, TimerService timerService,
AssetStorageService assetStorageService, ScheduledExecutorService executorService,
Assets assetsFacade, Users usersFacade, Notifications notificationsFacade, Webhooks webhooksFacade,
HistoricDatapoints historicDatapointsFacade, PredictedDatapoints predictedDatapointsFacade) {
this.ruleset = ruleset;
this.timerService = timerService;
this.assetStorageService = assetStorageService;
this.executorService = executorService;
this.assetsFacade = assetsFacade;
this.usersFacade = usersFacade;
this.notificationsFacade = notificationsFacade;
this.webhooksFacade = webhooksFacade;
this.historicDatapointsFacade = historicDatapointsFacade;
this.predictedDatapointsFacade = predictedDatapointsFacade;
String ruleCategory = ruleset.getClass().getSimpleName() + "-" + ruleset.getId();
LOG = SyslogCategory.getLogger(SyslogCategory.RULES, Ruleset.class.getName() + "." + ruleCategory);
}
protected void init() throws IllegalStateException {
if (ruleset.getMeta().containsKey(Ruleset.VALIDITY)) {
validity = ruleset.getValidity();
if (validity == null) {
LOG.log(Level.WARNING, "Ruleset '" + ruleset.getName() + "' has invalid validity value: " + ruleset.getMeta().get(Ruleset.VALIDITY));
status = VALIDITY_PERIOD_ERROR;
return;
}
}
if (TextUtil.isNullOrEmpty(ruleset.getRules())) {
LOG.finest("Ruleset is empty so no rules to deploy: " + ruleset.getName());
status = EMPTY;
return;
}
if (!ruleset.isEnabled()) {
LOG.finest("Ruleset is disabled: " + ruleset.getName());
status = DISABLED;
}
if (!compile()) {
LOG.log(Level.SEVERE, "Ruleset compilation error: " + ruleset.getName(), getError());
status = COMPILATION_ERROR;
}
}
public long getId() {
return ruleset.getId();
}
public String getName() {
return ruleset.getName();
}
public long getVersion() {
return ruleset.getVersion();
}
public Ruleset getRuleset() {
return ruleset;
}
public Rules getRules() {
return rules;
}
protected void updateValidity() {
Pair<Long, Long> fromTo = validity.getNextOrActiveFromTo(new Date(timerService.getCurrentTimeMillis()));
if (fromTo == null) {
nextValidity = EXPIRED;
LOG.log(Level.INFO, "Ruleset deployment '" + getName() + "' has expired");
} else {
nextValidity = fromTo;
LOG.log(Level.INFO, "Ruleset deployment '" + getName() + "' paused until: " + new Date(fromTo.key));
}
}
/**
* Returns the current or next time window in which this rule is active
* @return null if deployment has expired
*/
public Pair<Long, Long> getNextOrActiveFromTo() {
if (validity == null) {
return ALWAYS_ACTIVE;
}
if (nextValidity == EXPIRED) {
return nextValidity;
}
if (nextValidity == null || nextValidity.value <= timerService.getCurrentTimeMillis()) {
updateValidity();
}
return nextValidity;
}
public boolean compile() {
LOG.info("Compiling ruleset deployment: " + ruleset);
if (error != null) {
return false;
}
switch (ruleset.getLang()) {
case JAVASCRIPT:
return compileRulesJavascript(ruleset, assetsFacade, usersFacade, notificationsFacade, historicDatapointsFacade, predictedDatapointsFacade);
case GROOVY:
return compileRulesGroovy(ruleset, assetsFacade, usersFacade, notificationsFacade, historicDatapointsFacade, predictedDatapointsFacade);
case JSON:
return compileRulesJson(ruleset);
case FLOW:
return compileRulesFlow(ruleset, assetsFacade, usersFacade, notificationsFacade, historicDatapointsFacade, predictedDatapointsFacade);
}
return false;
}
public boolean canStart() {
return status != COMPILATION_ERROR && status != DISABLED && status != RulesetStatus.EXPIRED;
}
/**
* Called when a ruleset is started (allows for initialisation tasks)
*/
public boolean start(RulesFacts facts) {
if (!canStart()) {
return false;
}
if (jsonRulesBuilder != null) {
jsonRulesBuilder.start(facts);
}
running = true;
return true;
}
/**
* Called when this deployment is stopped, could be the ruleset is being updated, removed or an error has occurred
* during execution
*/
public boolean stop(RulesFacts facts) {
if (!running) {
return false;
}
running = false;
synchronized (scheduledRuleActions) {
scheduledRuleActions.removeIf(scheduledFuture -> {
scheduledFuture.cancel(true);
return true;
});
}
if (jsonRulesBuilder != null) {
jsonRulesBuilder.stop(facts);
}
return true;
}
public void onAssetStatesChanged(RulesFacts facts, RulesEngine.AssetStateChangeEvent event) {
if (jsonRulesBuilder != null) {
jsonRulesBuilder.onAssetStatesChanged(facts, event);
}
}
protected void scheduleRuleAction(Runnable action, long delayMillis) {
ScheduledFuture<?> future = executorService.schedule(() -> {
scheduledRuleActions.removeIf(Future::isDone);
action.run();
}, delayMillis, TimeUnit.MILLISECONDS);
scheduledRuleActions.add(future);
}
protected boolean compileRulesJson(Ruleset ruleset) {
try {
jsonRulesBuilder = new JsonRulesBuilder(LOG, ruleset, timerService, assetStorageService, executorService, assetsFacade, usersFacade, notificationsFacade, webhooksFacade, historicDatapointsFacade, predictedDatapointsFacade, this::scheduleRuleAction);
for (Rule rule : jsonRulesBuilder.build()) {
LOG.finest("Registering JSON rule: " + rule.getName());
rules.register(rule);
}
return true;
} catch (Exception e) {
setError(e);
return false;
}
}
protected boolean compileRulesJavascript(Ruleset ruleset, Assets assetsFacade, Users usersFacade, Notifications notificationsFacade, HistoricDatapoints historicDatapointsFacade, PredictedDatapoints predictedDatapointsFacade) {
// TODO path_to_url
ScriptEngine scriptEngine = scriptEngineManager.getEngineByName("nashorn");
ScriptContext newContext = new SimpleScriptContext();
newContext.setBindings(scriptEngine.createBindings(), ScriptContext.ENGINE_SCOPE);
Bindings engineScope = newContext.getBindings(ScriptContext.ENGINE_SCOPE);
engineScope.put("LOG", LOG);
engineScope.put("assets", assetsFacade);
engineScope.put("users", usersFacade);
engineScope.put("notifications", notificationsFacade);
engineScope.put("historicDatapoints", historicDatapointsFacade);
engineScope.put("predictedDatapoints", predictedDatapointsFacade);
String script = ruleset.getRules();
// Default header/imports for all rules scripts
script = "load(\"nashorn:mozilla_compat.js\");\n" + // This provides importPackage
"\n" +
"importPackage(\n" +
" \"java.util.stream\",\n" +
" \"org.openremote.model.asset\",\n" +
" \"org.openremote.model.attribute\",\n" +
" \"org.openremote.model.value\",\n" +
" \"org.openremote.model.rules\",\n" +
" \"org.openremote.model.query\"\n" +
");\n" +
"var Match = Java.type(\"org.openremote.model.query.AssetQuery$Match\");\n" +
"var Operator = Java.type(\"org.openremote.model.query.AssetQuery$Operator\");\n" +
"var NumberType = Java.type(\"org.openremote.model.query.AssetQuery$NumberType\");\n" +
"var StringPredicate = Java.type(\"org.openremote.model.query.filter.StringPredicate\");\n" +
"var BooleanPredicate = Java.type(\"org.openremote.model.query.filter.BooleanPredicate\");\n" +
"var StringArrayPredicate = Java.type(\"org.openremote.model.query.filter.StringArrayPredicate\");\n" +
"var DateTimePredicate = Java.type(\"org.openremote.model.query.filter.DateTimePredicate\");\n" +
"var NumberPredicate = Java.type(\"org.openremote.model.query.filter.NumberPredicate\");\n" +
"var ParentPredicate = Java.type(\"org.openremote.model.query.filter.ParentPredicate\");\n" +
"var PathPredicate = Java.type(\"org.openremote.model.query.filter.PathPredicate\");\n" +
"var RealmPredicate = Java.type(\"org.openremote.model.query.filter.RealmPredicate\");\n" +
"var AttributePredicate = Java.type(\"org.openremote.model.query.filter.AttributePredicate\");\n" +
"var AttributeExecuteStatus = Java.type(\"org.openremote.model.attribute.AttributeExecuteStatus\");\n" +
"var EXACT = Match.EXACT;\n" +
"var BEGIN = Match.BEGIN;\n" +
"var END = Match.END;\n" +
"var CONTAINS = Match.CONTAINS;\n" +
"var EQUALS = Operator.EQUALS;\n" +
"var GREATER_THAN = Operator.GREATER_THAN;\n" +
"var GREATER_EQUALS = Operator.GREATER_EQUALS;\n" +
"var LESS_THAN = Operator.LESS_THAN;\n" +
"var LESS_EQUALS = Operator.LESS_EQUALS;\n" +
"var BETWEEN = Operator.BETWEEN;\n" +
"var REQUEST_START = AttributeExecuteStatus.REQUEST_START;\n" +
"var REQUEST_REPEATING = AttributeExecuteStatus.REQUEST_REPEATING;\n" +
"var REQUEST_CANCEL = AttributeExecuteStatus.REQUEST_CANCEL;\n" +
"var READY = AttributeExecuteStatus.READY;\n" +
"var COMPLETED = AttributeExecuteStatus.COMPLETED;\n" +
"var RUNNING = AttributeExecuteStatus.RUNNING;\n" +
"var CANCELLED = AttributeExecuteStatus.CANCELLED;\n" +
"var ERROR = AttributeExecuteStatus.ERROR;\n" +
"var DISABLED = AttributeExecuteStatus.DISABLED;\n" +
"\n"
+ script;
try {
scriptEngine.eval(script, engineScope);
compileRulesJavascript((ScriptObjectMirror) engineScope.get("rules"));
return true;
} catch (Exception e) {
setError(e);
engineScope.clear();
return false;
}
}
/**
* Marshal the JavaScript rules array into {@link Rule} instances.
*/
protected void compileRulesJavascript(ScriptObjectMirror scriptRules) {
if (scriptRules == null || !scriptRules.isArray()) {
throw new IllegalArgumentException("No 'rules' array defined in ruleset");
}
Collection<Object> rulesObjects = scriptRules.values();
for (Object rulesObject : rulesObjects) {
ScriptObjectMirror rule = (ScriptObjectMirror) rulesObject;
String name;
if (!rule.containsKey("name")) {
throw new IllegalArgumentException("Missing 'name' in rule definition");
}
try {
name = (String) rule.getMember("name");
} catch (ClassCastException ex) {
throw new IllegalArgumentException("Defined 'name' of rule is not a string");
}
String description;
try {
description = rule.containsKey("description") ? (String) rule.getMember("description") : null;
} catch (ClassCastException ex) {
throw new IllegalArgumentException("Defined 'description' is not a string in rule: " + name);
}
int priority;
try {
priority = rule.containsKey("priority") ? (int) rule.getMember("priority") : DEFAULT_RULE_PRIORITY;
} catch (ClassCastException ex) {
throw new IllegalArgumentException("Defined 'priority' is not a number in rule: " + name);
}
if (!rule.containsKey("when")) {
throw new IllegalArgumentException("Missing 'when' function in rule: " + name);
}
Condition when;
try {
ScriptObjectMirror whenMirror = (ScriptObjectMirror) rule.getMember("when");
if (!whenMirror.isFunction()) {
throw new IllegalArgumentException("Defined 'when' is not a function in rule: " + name);
}
when = whenMirror.to(Condition.class);
} catch (ClassCastException ex) {
throw new IllegalArgumentException("Defined 'when' is not a function in rule: " + name);
}
Action then;
try {
ScriptObjectMirror thenMirror = (ScriptObjectMirror) rule.getMember("then");
if (!thenMirror.isFunction()) {
throw new IllegalArgumentException("Defined 'then' is not a function in rule: " + name);
}
then = thenMirror.to(Action.class);
} catch (ClassCastException ex) {
throw new IllegalArgumentException("Defined 'then' is not a function in rule: " + name);
}
LOG.finest("Registering javascript rule: " + name);
rules.register(
new RuleBuilder().name(name).description(description).priority(priority).when(when).then(then).build()
);
}
}
protected boolean compileRulesGroovy(Ruleset ruleset, Assets assetsFacade, Users usersFacade, Notifications notificationFacade, HistoricDatapoints historicDatapointsFacade, PredictedDatapoints predictedDatapointsFacade) {
try {
// TODO Implement sandbox
// new DenyAll().register();
Script script = groovyShell.parse(ruleset.getRules());
Binding binding = new Binding();
RulesBuilder rulesBuilder = new RulesBuilder();
binding.setVariable("LOG", LOG);
binding.setVariable("rules", rulesBuilder);
binding.setVariable("assets", assetsFacade);
binding.setVariable("users", usersFacade);
binding.setVariable("notifications", notificationFacade);
binding.setVariable("historicDatapoints", historicDatapointsFacade);
binding.setVariable("predictedDatapoints", predictedDatapointsFacade);
if (ruleset instanceof RealmRuleset) {
binding.setVariable("realm", ((RealmRuleset) ruleset).getRealm());
}
if (ruleset instanceof AssetRuleset) {
binding.setVariable("assetId", ((AssetRuleset) ruleset).getAssetId());
}
script.setBinding(binding);
script.run();
for (Rule rule : rulesBuilder.build()) {
LOG.finest("Registering groovy rule: " + rule.getName());
rules.register(rule);
}
return true;
} catch (Exception e) {
setError(e);
return false;
}
}
protected boolean compileRulesFlow(Ruleset ruleset, Assets assetsFacade, Users usersFacade, Notifications notificationsFacade, HistoricDatapoints historicDatapointsFacade, PredictedDatapoints predictedDatapointsFacade) {
try {
flowRulesBuilder = new FlowRulesBuilder(LOG, timerService, assetStorageService, assetsFacade, usersFacade, notificationsFacade, historicDatapointsFacade, predictedDatapointsFacade);
NodeCollection nodeCollection = ValueUtil.JSON.readValue(ruleset.getRules(), NodeCollection.class);
flowRulesBuilder.add(nodeCollection);
for (Rule rule : flowRulesBuilder.build()) {
LOG.info("Compiling flow rule: " + rule.getName());
rules.register(rule);
}
return true;
} catch (Exception e) {
LOG.log(Level.SEVERE, "Error evaluating flow rule ruleset: " + ruleset, e);
setError(e);
return false;
}
}
public RulesetStatus getStatus() {
if (isError() || status == DISABLED) {
return status;
}
Pair<Long, Long> validity = getNextOrActiveFromTo();
if (validity == EXPIRED) {
return RulesetStatus.EXPIRED;
}
if (validity != ALWAYS_ACTIVE) {
if (validity.key > timerService.getCurrentTimeMillis()) {
return RulesetStatus.PAUSED;
}
}
if (running) {
return DEPLOYED;
}
return status;
}
public void setStatus(RulesetStatus status) {
this.status = status;
}
public Throwable getError() {
return error;
}
public void setError(Throwable error) {
this.error = error;
}
public String getErrorMessage() {
return getError() != null ? getError().getMessage() : null;
}
public boolean isError() {
return status == RulesetStatus.LOOP_ERROR || status == VALIDITY_PERIOD_ERROR || ((status == RulesetStatus.EXECUTION_ERROR || status == RulesetStatus.COMPILATION_ERROR) && !isContinueOnError());
}
public boolean isContinueOnError() {
return ruleset.isContinueOnError();
}
public boolean isTriggerOnPredictedData() {
return ruleset.isTriggerOnPredictedData();
}
@Override
public String toString() {
return getClass().getSimpleName() + "{" +
"id=" + getId() +
", name='" + getName() + '\'' +
", version=" + getVersion() +
", status=" + getStatus() +
'}';
}
}
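// ---------------------------------------------------------------------------
// Illustrative aside, not part of the OpenRemote API: getNextOrActiveFromTo()
// above combines sentinel Pair values (ALWAYS_ACTIVE, EXPIRED) with lazy
// recomputation once a window's end has passed. The minimal, self-contained
// sketch below restates that pattern with long[] windows so the control flow
// can be read (and run) in isolation; every name in it is hypothetical.
// ---------------------------------------------------------------------------
class ValidityWindowSketch {

    static final long[] ALWAYS_ACTIVE = {0L, Long.MAX_VALUE};
    static final long[] EXPIRED = {0L, 0L};

    protected final boolean hasValidity; // whether a calendar validity is configured
    protected long[] nextValidity;       // cached current/next window, or a sentinel

    ValidityWindowSketch(boolean hasValidity) {
        this.hasValidity = hasValidity;
    }

    // Stand-in for CalendarEvent.getNextOrActiveFromTo(Date): returns the next
    // (or currently active) window, or null when no further occurrence exists.
    protected long[] computeWindow(long nowMillis) {
        return null; // illustrative default: a one-shot event that has passed
    }

    long[] getNextOrActiveFromTo(long nowMillis) {
        if (!hasValidity) {
            return ALWAYS_ACTIVE;             // no restriction configured
        }
        if (nextValidity == EXPIRED) {
            return nextValidity;              // identity check against the sentinel
        }
        if (nextValidity == null || nextValidity[1] <= nowMillis) {
            long[] fromTo = computeWindow(nowMillis);
            nextValidity = (fromTo == null) ? EXPIRED : fromTo; // cache the result
        }
        return nextValidity;
    }
}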
``` |
MSR may refer to:
Science and technology
Macrophage scavenger receptor, a receptor found in macrophages
Magnetic stripe reader, a device used to read magnetic stripe cards such as credit cards
M–sigma relation, in astrophysics
Mars sample return mission, a spaceflight mission to return rock and dust samples collected on Mars
Mirror self-recognition, in animals through the mirror test
Molten salt reactor, an advanced nuclear reactor design
Computing
Machine state register, a register used in PowerPC architectures
Model-specific register, a feature in x86 processors
Microsoft Reserved Partition, a space-management partition on a computer storage device
Mining software repositories, a field that analyzes the rich data available in software repositories
Entertainment
MSR Studios, a New York recording studio
The Most Serene Republic, a Canadian indie rock band
Metropolis Street Racer, a Dreamcast racing game
Mid-season replacement, a television series that premieres in the second half of a traditional season
Metroid: Samus Returns, a video game
Mulder Scully Romance, the relationship of the main characters of the television series The X-Files
Organizations and companies
Market Street Railway (nonprofit), a historic preservation organization in San Francisco, California, U.S.
Microsoft Research, the research division of Microsoft
Mountain Safety Research, a US company specializing in outdoor equipment
Mouvement social révolutionnaire, the French fascist party Revolutionary Social Movement
Movimiento Social Republicano, the Republican Social Movement political party in Spain
MSR - The Israel Center for Medical Simulation, Israeli institute for Simulation-Based Medical Education
Places
Most Serene Republic, a title in the name of various countries
Montserrat (ISO 3166-1 code)
Transportation
EgyptAir (ICAO designator), an Egyptian airline
Main supply route, a military operations supply route
Market Street Railway (transit operator), a defunct company in California, U.S.
Maserati, a luxury sports car company
Michigan Shore Railroad, US
Muar State Railway, a defunct railway formerly operating in the Johor Sultanate, British Malaya
Mumbai Suburban Railway, India
Muş Airport (IATA airport code), Turkey
Weapons
Modern sporting rifle or AR-15 style rifle
Modular Sniper Rifle, a sniper rifle produced by Remington Arms
Magnum Sniper Rifle or Accuracy International Arctic Warfare, a bolt-action sniper rifle
Other uses
Market Stability Reserve, a part of the European Union Emission Trading Scheme
Milan–San Remo, an annual bicycle race
Mortgage servicing rights, the rights to collect mortgage payments from a borrower for the benefit of the lender
Meyer Shank Racing, an American auto racing team |
Sarvāstivāda-Vaibhāṣika, or simply Vaibhāṣika, is an ancient Buddhist tradition of Abhidharma (scholastic Buddhist philosophy) which was very influential in north India, especially Kashmir. In various texts they referred to their tradition as Yuktavāda (the doctrine of logic), and another name for them was Hetuvāda. The Vaibhāṣika school was an influential subgroup of the larger Sarvāstivāda school. They were distinguished from other Sarvāstivāda sub-schools, such as the Sautrāntika, by their orthodox adherence to the doctrines found in the Mahāvibhāṣa, from which their name is derived (vaibhāṣika is a vṛddhi derivative of vibhāṣā, meaning "related to the vibhāṣā"). Vaibhāṣika thought significantly influenced the Buddhist philosophy of all major Mahāyāna Buddhist schools of thought, and also influenced the later forms of Theravāda Abhidhamma, though to a much lesser extent.
The Sarvāstivāda tradition arose in the Mauryan Empire during the second century BCE, and was possibly founded by Kātyāyanīputra (c. 150 BCE). During the Kushan era, the "Great Commentary" (Mahāvibhāṣa) on Abhidharma was compiled, marking the beginning of Vaibhāṣika as a school of thought in its own right. The tradition was well supported by Kanishka, and later spread throughout north India and Central Asia. It maintained its own canon of scriptures in Sanskrit, which included a seven-part Abhidharma Pitaka collection. Vaibhāṣika remained the most influential Buddhist school in northwest India from the first century CE until the seventh century.
Despite numerous variations and doctrinal disagreements within the tradition, most Sarvāstivāda-Vaibhāṣikas were united in their acceptance of the doctrine of "sarvāstitva" (all exists), which says that all phenomena in the three times (past, present and future) can be said to exist. Another defining Vaibhāṣika doctrine was that of simultaneous causation (sahabhū-hetu), hence their alternative name of "Hetuvāda".
Sources
Canonical texts
The main source of this tradition is the Sarvāstivāda Abhidharma Pitaka.
The texts of the Sarvāstivādin Abhidharma Pitaka are:
Sangītiparyāya ('Discourses on Gathering Together'), essentially a commentary on the Samgiti-sutra (T 9, Digha-nikaya no. 33).
Dharmaskandha ('Aggregation of Dharmas'), a list of key doctrinal topics.
Prajñāptiśāstra ('Treatise on Designations'), a list of doctrinal topics followed by question and answer sections.
Dhātukāya ('Collection of Elements'), similar to the Dhātukathā, though it uses a different doctrinal list of dharmas.
Vijñānakāya ('Collection of Consciousness'), attributed to master Devasarman. It is here that the existence of all dharmas through past, present and future, is first found.
Prakaraṇapāda ('Exposition')
Together, these comprise the Six Treatises (Chinese: 六足論; Sanskrit: षड्पादशास्त्र, ṣaḍ-pāda-śāstra). The seventh text is the Jñānaprasthāna ('Foundation of Knowledge'), also known as Aṣṭaskandha or Aṣṭagrantha, said to be composed by Kātyāyanīputra.
Yaśomitra is said to have likened this text to the body of the above six treatises, referring to them as its legs (pādas).
Exegetical texts
The Jñānaprasthāna became the basis for Sarvāstivāda exegetical works called vibhāṣa, which were composed in a time of intense sectarian debate among the Sarvāstivādins in Kashmir. These compendia not only contain sutra references and reasoned arguments but also introduce new doctrinal categories and positions. The most influential of these was the Abhidharma Mahāvibhāṣa Śāstra ("Great Commentary"), a massive work which became the central text of the Vaibhāṣika tradition, which in turn became the Kashmiri Sarvāstivāda orthodoxy under the patronage of the Kushan empire.
There are also two other extant vibhāṣa compendia, though there is evidence that many more such works, now lost, once existed. The Vibhāṣa Śāstra of Sitapani and the Abhidharma Vibhāṣa Śāstra translated by Buddhavarman between 437 and 439 CE are the other extant vibhāṣa works. Though some scholars date the Mahāvibhāṣa to the reign of Kanishka in the first century CE, this dating is uncertain; it was in any case translated into Chinese by the late 3rd or early 4th century CE.
Treatises
In addition to the canonical Sarvāstivādan Abhidharma, a variety of expository texts or treatises were written to serve as overviews and introductions to the Abhidharma. The best known belonging to the Sarvāstivāda tradition are:
Abhidharma-hṛdaya-sastra (The Heart of Abhidharma), by the Tocharian Dharmasresthin, c. 1st century BCE, Bactria. It is the oldest example of a systematized Sarvāstivāda treatise.
Abhidharma-āmṛtarasa (The Taste of the Deathless) by the Tocharian Ghoṣaka, 2nd century CE, based on the above work.
Abhidharma-hṛdaya-sastra (The Heart of Abhidharma) by Upasanta, also based on Dharmasresthin's hṛdaya-sastra.
Samyuktabhidharma-hṛdaya by Dharmatrata, also based on Dharmasresthin's hṛdaya-sastra.
Abhidharmakośa-bhāsya (Treasury of Higher Knowledge) by Vasubandhu (4th or 5th century) – a highly influential series of verses and accompanying commentary by Vasubandhu. It often critiques Vaibhāṣika views from a Sautrantika perspective. This is the main text used to study Abhidharma in Tibet and East Asia. It remains influential in Chinese and Tibetan Buddhism. However, K.L. Dhammajoti notes that this work sometimes presents the Vaibhāṣika views unfairly.
Abhidharmakośopāyikā-ṭīkā, a commentary on the Kośa by Śamathadeva
Nyāyānusāra (Conformance to Correct Principle) by Saṃghabhadra, an attempt to criticize Vasubandhu and defend orthodox Vaibhāṣika views.
Abhidharma-samaya-pradīpikā, a compendium of the above by Saṃghabhadra.
Abhidharmavatara ("Descent into the Abhidharma"), an introductory treatise by master Skandhila (5th century).
Abhidharma-dipa and its auto-commentary, the Vibhasa-prabha-vrtti, a post-Saṃghabhadra Vaibhāṣika treatise which follows closely the Abhidharmakośa verses and attempts to defend Vaibhāṣika orthodoxy.
The most mature and refined form of Vaibhāṣika philosophy can be seen in the work of master Saṃghabhadra (c. fifth century CE), "undoubtedly one of the most brilliant Abhidharma masters in India". His two main works, the *Nyāyānusāra (Shun zhengli lun 順正理論) and the *Abhidharmasamayapradīpikā (Apidamo xian zong lun 阿毘達磨顯宗論), are very important sources for late Vaibhāṣika thought. His work was referenced and cited by various important figures, such as Xuanzang and Sthiramati.
Dharmas
Dharmas and their characteristics
All Buddhist schools of Abhidharma divided up the world into "dharmas" (phenomena, factors, or "psycho-physical events"), which are the fundamental building blocks of all phenomenal experience. Unlike the sutras, the Abhidharma analyzes experience into these momentary psycho-physical processes. Dharmas refers to the discrete and impermanent instances of consciousness along with their intentional objects that rapidly arise and pass away in sequential streams. They are analogous to atoms, but are psycho-physical. Hence, according to Noa Ronkin, "all experiential events are understood as arising from the interaction of dharmas."
From the Vaibhāṣika perspective, "Abhi-dharma" refers to analyzing and understanding the nature of dharmas and the wisdom (prajñā) that arises from this. This systematic understanding of the Buddha's teaching was seen by Vaibhāṣikas as the highest expression of the Buddha's wisdom which was necessary to practice the Buddhist path. It is seen as representing the true intention of the Buddha on the level of absolute truth (paramārtha-satya). According to the Mahāvibhāṣa, "abhidharma is [precisely] the analysis of the intrinsic characteristics and common characteristics of dharmas."
For Vaibhāṣikas, dharmas are the "fundamental constituents of existence" which are discrete and real entities (dravya). K.L. Dhammajoti states: "A dharma is defined as that which holds its intrinsic characteristic (svalakṣaṇadhāraṇād dharmaḥ). The intrinsic characteristic of the dharma called rūpa, for example, is the susceptibility of being molested (rūpyate), obstructability and visibility; that of another dharma called vedanā is sensation, etc. And for a dharma to be a dharma, its intrinsic characteristic must be sustainable throughout time: a rūpa remains a rūpa irrespective of its various modalities. It can never be transformed into a different dharma (such as vedanā). Thus, a uniquely characterizable entity is a uniquely real (in the absolute sense) entity, having a unique intrinsic nature (svabhāva): 'To be existent as an absolute entity is to be existent as an intrinsic characteristic (paramārthena sat svalakṣaṇena sad ityarthaḥ).'" This idea is seen in the Jñānaprasthāna, which states: "Dharmas are determined with respect to nature and characteristic ... Dharmas are determined, without being co-mingled. They abide in their intrinsic natures, and do not relinquish their intrinsic natures" (T26, 923c).
According to Vaibhāṣikas, the svabhāvas of dharmas are those things that exist substantially (dravyasat) as opposed to those things which are made up of aggregations of dharmas and thus only have a nominal existence (prajñaptisat). This distinction is also termed the doctrine of the two truths, which holds that there is a conventional truth (saṁvṛti) that refers to things which can be further analyzed, divided or broken up into smaller constituents and an ultimate truth (paramārtha) referring to that which resists any further analysis.
Thus, a dharma's intrinsic characteristic (svalakṣaṇa) and the very ontological existence of a dharma (i.e. svabhāva, "intrinsic nature", or dravya, "substance") is one and the same. For the Vaibhāṣika school, this "own nature" (svabhāva) was said to be the characteristic of a dharma that persists through the three times (past, present and future).
Vaibhāṣika Abhidharma also describes dharmas as having "common characteristics" (sāmānya-lakṣaṇa), which applies to numerous dharmas (for example, impermanence applies to all material dharmas and all feelings, etc.). Only the mental consciousness can cognize common characteristics.
However, the intrinsic characteristics of a dharma have a certain kind of relativity due to the relationship between various dharmas. For example, all rūpa (form) dharmas have the common characteristic of resistance, but this is also an intrinsic characteristic with respect to other dharmas like vedanā (feeling).
Also, various sources state that the intrinsic nature of a dharma is "weak" and that they are interdependent with other dharmas. The Mahāvibhāṣa states that "conditioned dharmas are weak in their intrinsic nature, they can accomplish their activities only through mutual dependence" and that "they have no sovereignty (aisvarya). They are dependent on others." Thus, an intrinsic nature (svabhāva) arises due to dependently originated processes or relationships between various dharmas and therefore, a svabhāva is not something which is completely ontologically independent.
Classification of dharmas
Abhidharma thought can be seen as an attempt at providing a complete account of every type of experience. Therefore, an important part of Vaibhāṣika Abhidharma comprises the classification, definition and explanation of the different types of dharma as well as the analysis of conventional phenomena and how they arise from the aggregation of dharmas. Thus there is the element of dividing up things into their constituents as well as the element of synthesis, i.e. how dharmas combine to make up conventional things.
The Vaibhāṣikas made use of classic early Buddhist doctrinal categories such as the five skandhas, the sense bases (ayatanas) and the "eighteen dhātus". Beginning with the Pañcavastuka of Vasumitra, the Vaibhāṣikas also adopted a five group classification of dharmas which outlined a total of 75 types of phenomena.
The five main classifications of dharmas are:
Rūpa (11 dharma types), refers to matter or physical phenomena/events.
Citta (1 type), refers to thought, intentional consciousness or the bare phenomenon of consciousness. Its main characteristic is cognizing an object.
Caitasikas (46 types) refers to "thought-concomitants", mental events or "associated mentality".
Cittaviprayuktasaṃskāras (14 types) refers to "conditionings disjoined from thought" or "factors disassociated from thought". This category is unique to Vaibhāṣika and not shared with other Abhidharma schools. It groups together various experiential events that are not associated with thought but are also not physical.
Asaṃskṛta dharmas (3 types) refers to the three unconditioned dharmas: space and two states of cessation (nirodha).
Dharmas are also classified and divided into further taxonomical categories providing further aids to understanding the Buddhist view and path. Some of the major ways that the Vaibhāṣikas classified dharmas include the following:
Skillful, wholesome or useful on the path (kuśala), unskillful (akuśala) or non-defined/non-determined (avyākṛta). Skillful dharmas generate desirable and good outcomes, unskillful ones are the opposite. Non-defined dharmas are neither good nor bad.
Saṃskṛta (conditioned, fabricated), or asaṃskṛta (unconditioned). According to the Mahāvibhāṣa, a dharma is conditioned "if it has arising and ceasing, cause and effect, and acquires the characteristics of the conditioned."
Sāsrava (with āsravas, which are the "outflows" or mental impurities, a synonym for defilement) and anāsrava (without āsravas).
Darśana-heya are sāsrava dharmas abandonable by vision (into the four noble truths), bhāvanā-heya are sāsrava dharmas abandonable by cultivation of the Buddhist path, and aheya dharmas are anāsrava dharmas that are not to be abandoned.
Rūpa (matter)
Matter is that which is "subject to deterioration or disintegration." As Vasubandhu says, it is what "is repeatedly molested/broken" by contact. The main way of defining matter for Vaibhāṣikas is that it has two main distinctive natures: resistance (sa-pratighātatva), which is “the hindrance to the arising of another thing in its own location,” and visibility (sa-nidarśanatva), which allows one to locate matter since "it can be differently indicated as being here or being there" (Saṃghabhadra).
The primary material dharmas are the four Great Elements (mahābhūta, "Great Reals") — earth (pṛthivī), water (ap), fire (tejas), air (vāyu). All other material dharmas are "derived matter" (upādāya-rūpa/bhautika), which arise on the basis of the Great Elements. According to Dhammajoti: "The four Great Elements exist inseparably from one another, being co-existent causes (sahabhū-hetu) one to another. Nevertheless, rūpa-dharma‑s are manifested and experienced in diverse forms because of the difference in intensity or substance of one or more of the four Elements."
Vaibhāṣika also had a theory of atoms. However, these atoms (paramāṇu) were not seen as eternally immutable or permanent, but as momentary. For Vaibhāṣika, an atom is the smallest unit of matter: it cannot be cut or broken up and has no parts. Atoms come together (without touching each other) to form aggregations or "molecules". The Vaibhāṣikas held that this is "known through mental analysis."
Mind and mental factors
In Vaibhāṣika Abhidharma, the mind is a real entity, which is referred to by three mostly synonymous terms: citta, manas (thinking) and vijñāna (cognition), which are sometimes seen as different functional aspects of the mind. As defined by K.L. Dhammajoti, citta "is the general discernment or apprehension with respect to each individual object. This discernment is the mere grasping of the object itself, without apprehending any of its particularities." Saṃghabhadra defines it as what "grasps the characteristic of an object in a general manner."
Citta never arises by itself, it is always accompanied by certain mental factors or events (caittas or caitasikas), which are real and distinct dharmas that make a unique contribution to the mental process. Therefore, a moment of thought always has a specific nature and content. Cittas and caittas always arise together simultaneously in mutually dependent relationships.
The doctrine which said that these two always arise and operate together is called "conjunction" (saṃprayoga). What conjunction meant was a disputed topic among the early masters. Later, it came to be accepted that for citta and caittas to be conjoined, the following had to be true: both must be supported by the same basis (āśraya i.e. sense organ), they must have the same object (ālambana), mode of activity (ākāra), same time (kāla), and the same substance (dravya). This doctrine was repudiated by the Sautrāntika, who held that dharmas only arise successively, one after the other.
As seen in their list of dharmas, the Vaibhāṣikas classified caittas into various sub-categories based on various qualities. For example, the first classification, the universal dharmas (mahābhūmika), are so called because they exist in all types of citta. Then there are also universal good dharmas (kuśala mahābhūmikā) and universal defilements (kleśa).
One of the major controversies in Abhidharma Buddhism dealt with the question of the original nature of citta. Some, like the Mahāsāṃghika, held the view that it retains an originally pure nature. Vaibhāṣikas like Saṃghabhadra rejected this view, holding that the nature of citta can also be defiled.
Cittaviprayuktasaṃskāras
Unlike other Abhidharma schools, the Vaibhāṣikas added another ultimate classification termed citta-viprayukta-saṃskāra, “conditionings (forces) disjoined from thought.” These "are real entities which are neither mental nor material in nature, which yet can operate on both domains" and can be seen as laws of nature. Dhammajoti notes however that the Abhidharma works of other schools like the *Śāriputrābhidharma also contain this category, just not as one of the main ultimate classifications. He also notes that there was never full agreement on how many dharmas are found in this category and that the Sautrāntikas did not accept their reality. Thus it was a much debated topic in Northern Abhidharma traditions.
Perhaps the most important of these conditionings are acquisition (prāpti) and non-acquisition (aprāpti). Acquisition:

Is a force that links a dharma to a particular serial continuity (santati/santāna), i.e., the individual. Non-acquisition is another real entity whose function and nature are just opposed to those of acquisition: It acts to ensure that a given dharma is delinked from the individual serial continuity...It was at a relatively later stage that acquisition came to be defined generally as the dharma that effects the relation of any dharma to a living being (santāna).

These conditionings are particularly important because, given the theory of tri-temporal existence, acquisition is central to the Vaibhāṣika understanding of defilement and purification. Since a defilement is a real dharma that exists always (sarvadā asti), it cannot be destroyed; it can, however, be de-linked from an individual by disrupting the acquisition-series. This also helps to explain how one can obtain a pure dharma such as nirvāṇa, since it is only through acquisition that one experiences nirvāṇa.
Another doctrinally important set of conditionings are "the four characteristics of the conditioned (saṃskṛta-lakṣaṇa)." Dharmas are said to have the production-characteristic (jāti-lakṣaṇa), which allows them to arise; the duration-characteristic (sthiti-lakṣaṇa), which enables them to remain temporarily; and the decay-characteristic (jarā‑lakṣaṇa), the force which impairs their activity so that they can no longer continue projecting another distinct effect. A dharma also has the impermanence or disappearance characteristic (anityatā/vyaya-lakṣaṇa), which causes it to enter into the past.
Asaṃskṛta (the unconditioned)
Unconditioned dharmas are those which exist without being dependently co-arisen (pratītya-samutpanna); they are neither temporal nor spatial. They transcend arising and ceasing, and are real existents that possess a unique efficacy (though not a temporal causal efficacy like that of other dharmas).
The Vaibhāṣika school taught three types of unconditioned dharmas: space (ākāśa), cessation through deliberation (pratisaṃkhyā-nirodha), and cessation independent of deliberation (apratisaṃkhyā-nirodha).
In the MVŚ, some disagreement among Sarvāstivāda masters regarding these dharmas can be seen. Some like "the Bhadanta" (Dharmatrāta) denied the reality of space. Meanwhile, Dārṣṭāntikas denied the ontological reality of all three.
According to Dhammajoti, cessation through deliberation refers to "the cessation of defilements acquired through the process of discriminative or deliberative effort." There are just as many of these cessations as there are with-outflow dharmas. Cessations independent of deliberation, meanwhile, "are those acquired simply on account of the deficiency in the required assemblage of conditions for the particular dharma‑s. They are so called because they are independent of any deliberative effort." There are as many of these cessations as there are conditioned dharmas.
Cessation through deliberation is also the technical term for the Buddhist goal of nirvāṇa, which is also defined as "a disjunction (visaṃyoga) from with-outflow dharma‑s acquired through the process of discrimination/deliberation (pratisaṃkhyāna) which is a specific outflow-free prajñā." Nirvāṇa is the absolute absence of karma and the defilements, the escape from the skandhas and all saṃsāric existence, which is attained by an arhat.
Nirvāṇa's real existence
In Sarvāstivāda, nirvāṇa is a "distinct positive entity" (dravyāntara). It is "an ontologically real force that is acquired by the practitioner when a given defilement is completely abandoned." This force ensures that the defilement's acquisition will never arise again. Master Skandhila's definition indicates how this real entity has a positive presence, which is said to be "like a dike holding back the water or a screen blocking the wind."
Vaibhāṣika holds that the real existence of nirvāṇa is supported both by direct perception and by scriptures which depict the Buddha stating that "there is definitely the unborn." Sautrāntikas disagree with this interpretation of scripture, holding that the unborn simply refers to the discontinuity of birth (janmāpravṛtti), a mere concept referring to the absence of suffering due to the abandoning of the defilements, and is thus only relatively real (prajñaptisat). However, Saṃghabhadra argues that "it is only when the unborn is conceded to be a distinct real entity that it is meaningful to say 'there is'. Besides, if there were no such entity, the Buddha should have simply said 'there is the discontinuity of the born.'"
According to Vaibhāṣika, nirvāṇa must be an ultimately real existent because no real supporting phenomena can be found which could serve as the basis on which to designate nirvāṇa as a relative existent (as the aggregates serve to designate the self as relative, for example). Also, if nirvāṇa is not a real force, then beings could not give rise to delight in nirvāṇa and disgust towards saṃsāra, for nirvāṇa would be inferior in terms of existence. It would also mean that the Buddha had been deluding everyone by speaking of non-existents in the same way that he spoke of the existents.
Furthermore, if nirvāṇa was unreal, it could not be one of the four noble truths, since a non-existent cannot be said to be true or false. An ārya is said to directly see the four truths, including the third truth of duḥkhanirodha (the end of suffering, i.e. nirvāṇa) and wisdom cannot arise with regard to a non-existent object.
Time and Ontology
Existence
The name Sarvāstivāda literally means "all exists" (sarvam asti), referring to the doctrine that all dharmas, past, present and future, exist. This doctrine of tri-temporal existence has been described as an eternalist theory of time.
What does it mean for a dharma to exist? For the Sarvāstivāda Abhidharmikas, the main reasons that something is real or existent is causal efficacy and the fact that it abides in its own nature (svabhāva). The Vaibhāṣika philosopher Saṃghabhadra defines an existent as follows: "The characteristic of a real existent is that it serves as an object-domain for generating cognition (buddhi)." Each cognition is intentional and it has a distinctive character which is caused by the intrinsic characteristic (svalakṣaṇa) of the object of cognition. If there is no object of cognition (viṣaya), there is no cognition.
Furthermore, according to Saṃghabhadra, only if there are true existent forms can there be a difference between correct and incorrect cognitions regarding material things.
Saṃghabhadra further adds that there are two types of existents:

What exists truly (dravyato’sti) and what exists conceptually (prajñaptito’sti), the two being designated on the basis of conventional truth and absolute truth. If, with regard to a thing, a cognition (buddhi) is produced without depending on anything else, this thing exists truly — e.g., rūpa, vedanā, etc. If it depends on other things to produce a cognition, then it exists conceptually/relatively — e.g., a vase, army, etc.

Furthermore, things that truly exist are also of two types: those that just have their own nature and those that have both their own nature and activities (kāritra). This last type is further divided into two: "with or without function (sāmarthya/vyāpara/śakti)." Lastly, relative existents are also of two types, "having existence on the basis of something real or on something relative, like a vase and an army, respectively."
Arguments in favor of temporal eternalism
According to Jan Westerhoff, one reason for holding this theory was that moments of consciousness are intentional (directed, "about something"): if no past entities exist, thoughts about them would be objectless and could not exist. Another argument is the need to account for past actions (karma) which have effects at a later time: if an act of karma no longer exists, it is difficult, argues the Vaibhāṣika, to see how it can bear fruit in the present or future. Finally, past, present and future are mutually interdependent ideas; if past and future are non-existent, argued the Vaibhāṣikas, how can one make sense of the existence of the present?
In the Samyukta-abhidharma-hrdaya, a fourth century Gandharan Sarvāstivāda text, the core Sarvāstivāda theory is defended thus:
Vasubandhu outlines the main arguments, based on scripture and reason, for "all exists" as follows:
a. For, it has been said by the Buddha: “O bhikṣus, if past rūpa did not exist, the learned noble disciple could not have become disgusted with regard to the past rūpa. It is because past rūpa exists that the learned noble disciple becomes disgusted with regard to the past rūpa. If future rūpa did not exist, the learned noble disciple could not have become free from delight with regard to the future rūpa. It is because future rūpa exists that…”
b. It has been said by the Buddha, “Conditioned by the two [— sense organ and the object —], there is the arising of consciousness…”
c. Consciousness arises when there is an object, not when there is no object. This is a fixed principle. If past and future [dharma‑s] were non-existent, there would be a consciousness having a non-existent object. Hence, in the absence of an object, consciousness itself would not exist.
d. If past [dharma‑s] were non-existent, how could there be in the future the fruit of pure or impure karma? For it is not the case that at the time of the arising of the fruit a present retribution-cause exists!
Temporality
Regarding time (adhvan), for Vaibhāṣikas, it is just a superimposition on the activity of these different types of dharmas and does not exist independently. Because of this, there was a need to explain how one experiences time and change. Among the different Sarvāstivāda thinkers, there were different ideas on how dharmas change so as to give rise to the experience of time. The Mahāvibhāṣa (MVŚ) speaks of four major theories which attempt to do this:
The theory which says there is a change in mode of being (bhāva-anyathātva).
The theory which says there is a change in characteristic (lakṣaṇa-anyathātva).
The theory which says there is a change in state or condition (avasthā-anyathātva).
The theory which says there is a change in [temporal] relativity (anyathā-anyathātva).
The positions are further outlined by Vasubandhu as follows:
"The Bhadanta Dharmatrata defends change in mode of being, that is, he affirms that the three time periods, past, present, and future, are differentiated by their non-identity of existence (bhava). When a dharma goes from one time period to another its nature is not modified, but its existence is."
"The Bhadanta Ghosaka defends change in characteristic, that is, the time periods differ through the difference in their characteristics. A dharma goes through the time periods. When it is past, it is endowed with past characteristics (laksana), but it is not deprived of its present and future characteristics..." [and so on with present and future]
"The Bhadanta Vasumitra defends change in state/condition, that is, the time periods differ through the difference of condition (avastha). A dharma, going through the time periods, having taken up a certain condition, becomes different through the difference of its condition, not through a difference in its substance. Example: a token placed on the square of ones, is called one; placed on the square of tens, ten; and placed on the square of hundreds, one hundred."
"The Bhadanta Buddhadeva defends change in [temporal] relativity, that is, the time periods are established through their mutual relationships. A dharma, going throughout the time periods, takes different names through different relationships, that is, it is called past, future, or present, through a relationship with what precedes and with what follows. For example, the same woman is both a daughter and a mother."
In the Abhidharmakośa, Vasubandhu argues that "the best system is that of Vasumitra". The Samyukta-abhidharma-hrdaya agrees.
Later Sarvāstivāda developed a combination of the first and third views. This can be seen in Saṃghabhadra, who argues that while a dharma's essential nature does not change, its function or activity (kāritra) and its existence (bhāva) change:

The essential nature of a dharma remains eternally; its bhāva [existence] changes: When a saṃskṛta [conditioned] dharma traverses through adhvan [time], it gives rise to its kāritra [activity] in accordance with the pratyaya-s [conditions], without abandoning its substantial nature; immediately after this, the kāritra produced ceases. Hence it is said that the svabhāva exists eternally and yet it is not permanent, since its bhāva changes.

Thus, for Saṃghabhadra, "a dharma is present when it exercises its kāritra, future when its kāritra is not yet exercised, past when it has been exercised." The term kāritra is defined as "a dharma’s capability of inducing the production of its own next moment." When the right set of conditions comes together, a dharma becomes endowed with activity (which vanishes in a single moment). When it does not have activity, a dharma's own nature still has the capacity to causally contribute to other dharmas.
Svabhāva in time
Regarding the essential nature (svabhāva) or reality (dravya) of a dharma, all Vaibhāṣika thinkers agreed that it is what remains constant and does not change as a dharma moves throughout the three times. However, as noted by K.L. Dhammajoti, this does not necessarily mean that a dharma's svabhāva "is immutable or even permanent, for a dharma’s mode of existence and its essential nature are not different, so that when the former is undergoing transformation, so is its svabhāva."
From the Vaibhāṣika perspective this is not a contradiction, since it is the same process that remains (even while changing) throughout time. Thus, in this particular sense, there is no change in the svabhāva or svalakṣaṇa. This is said to be the case even though a dharma is always being transformed into different modes of being. Each of these is actually a new occasion or event in a causal stream (though it is not different in its nature from previous dharmas in that stream). Thus, according to K.L. Dhammajoti, there is a way in which the essential natures are transformed, and yet one can say that they remain the same ontologically. Dharmatrāta used the example of a piece of gold that is transformed into different things (cups, bowls, etc.). While there are different entities, the essential nature of gold remains the same.
This perspective is expressed by Saṃghabhadra, who argues that svabhāva is not permanent since it goes through time and its existence (bhāva) varies through time. Saṃghabhadra also notes that a dharma is produced by various causes (and is part of a causal web which has no beginning), and once a dharma has ceased, it does not arise again. However, for Saṃghabhadra, one can still say that dharmas do not lose their svabhāva. He uses the example of vedanā (sensation): even though we speak of various modes of sensation, all the types of sensation in a person's mindstream have the same nature of being sensitive phenomena (prasāda rūpa). Saṃghabhadra then states:

It is not the case that since the function is different from the existence, that there can be the difference in the functions of seeing, hearing, etc. Rather, the very function of seeing, etc., is none other than the existence of the eye, etc. On account of the difference in function, there is definitely the difference in the mode of existence… Since it is observed that there are dharma‑s that co-exist as essential substances and whose essential characteristics do not differ but that [nevertheless] have different modes of existence, we know that when dharma‑s traverse the three times, their modes of existence vary while their essential characteristics do not change.

He also states:

[Our explanations] also have properly refuted the objection that [our theory of sarvāstitva] implies the permanence of [a dharma’s] essential nature, for, while the essential nature remains always [the same], its avasthā [condition] differs [in the stages of time] since there is change. This difference of avasthā is produced on account of conditions and necessarily stays no more than one kṣaṇa [moment]. Accordingly, the essential nature of the dharma too is impermanent, since it is not distinct from the difference [that arises in it]. [But] it is only in an existent dharma that changes can obtain; there cannot be change in a non-existent. In this way, therefore, we have properly established the times.

According to K.L. Dhammajoti, what the Vaibhāṣikas had in mind with this view was that even though the different dharmas in a causal series are different entities, there is an overall "individuality or integrity", and the series thus remains "dynamically identical." This is a relationship of identity-in-difference (bhedābheda). In this sense, a svabhāva is not a static entity; it is impermanent and undergoes change, and yet "ontologically it never becomes a totally different substance." Saṃghabhadra claimed that it is only when understood in this way that the doctrine of "all exists" is logically compatible with the doctrine of impermanence.
Momentariness
Orthodox Sarvāstivāda also defended the theory of moments (kṣaṇavada). This doctrine held that dharmas last only for a single moment, the smallest possible measure of time, which is described in the Samyukta-abhidharma-hrdaya as:
Theory of Causality
An important topic covered in Vaibhāṣika Abhidharma was the investigation of causes, conditions and their effects. Vaibhāṣikas used two major schemes to explain causality: the four conditions (pratyaya) and the six causes (hetu). In this system, the arising of dharmas is totally dependent on specific causes. Causal force is what makes a dharma real, and thus dharmas are also called saṃskāras (conditioning forces). Because of this, all dharmas belong to some kind of causal category and are said to have causal efficacy. Indeed, it is only through examining their causes that their intrinsic natures manifest in a cognizable way. In the Vaibhāṣika system, the activities of dharmas arise through the mutual interdependence of causes. Thus, their intrinsic natures are said to be "feeble", meaning that dharmas are not able to act on their own and their activity is dependent on other dharmas.
A particularly unique feature of the Vaibhāṣika system is their acceptance of simultaneous causation. These "co-existent causes" are an important part of the Sarvāstivāda understanding of causality. It allowed them to explain their theory of direct realism, that is to say, their affirmation that we perceive real external objects. It also was used in their defense of temporal eternalism. Thus, it was central to their understanding of cause and effect. For thinkers like Saṃghabhadra, a sense organ and its object must exist at the same moment together with its effect, the perception. Thus, for a cause to be efficacious, it must exist together with its effect. This view of simultaneous causation was rejected by the Sautrāntikas, but later adopted by the Yogācāra school.
The Six Causes
Efficient cause (kāraṇa-hetu). According to Dhammajoti, "It is any dharma that either directly or indirectly — by not hindering — contributes to the arising of another dharma." Vasubandhu defines it as: "A conditioned dharma has all dharma‑s, excepting itself, as its efficient cause, for, as regards its arising, [these dharma‑s] abide in the state of non-obstructiveness." This type of cause is rejected by Sautrāntikas like Śrīlāta.
Homogeneous cause (sabhāga-hetu). This refers to the kind of causality in which an effect is of the same moral type as the previous cause in a series. Thus, in the series c1 → c2 → c3, if c1 is skillful, it is the homogeneous cause for c2 which is also skillful, and so on. According to Vaibhāṣika, this form of causality exists among mental and material dharmas, but Sautrāntikas deny that it can apply to material dharmas.
Universal cause (sarvatraga-hetu). This is similar to the homogeneous cause in that it is a cause that produces the same kind of effect, however, it only applies to defiled dharmas. Another way it is distinct from the homogeneous is that there is "no necessary homogeneity in terms of category of abandonability." This is because, as Saṃghabhadra says in the Nyāyānusāra, "they are the cause of [defiled dharma‑s] belonging to other categories as well, for, through their power, defilements belonging to categories different from theirs are produced."
Retribution cause (vipāka-hetu). These are the skillful or unskillful dharmas that are karmic causes and thus lead to good or bad karmic retribution. For Vaibhāṣikas, retribution causes and their fruits comprise all five aggregates. Sautrāntikas held that the retribution cause is only volition (cetanā), and the retribution fruit comprises only sensation (vedanā).
Co-existent cause (sahabhū-hetu). This is a new causal category developed by Sarvāstivāda. The Mahāvibhāṣa states that the intrinsic nature of the co-existent cause is "all the conditioned dharma‑s." Saṃghabhadra's Nyāyānusāra states that this refers to those causes "that are reciprocally virile effects, on account of the fact that they can arise by virtue of mutual support … For example: the four Great Elements are co-existent cause mutually among themselves … for it is only when the four different kinds of Great Elements assemble together that they can be efficacious in producing the derived matter (upādāya rūpa)... In this way, the whole of the conditioned, where applicable (i.e., where a mutual causal relationship obtains) are co‑existent causes." Another sense in which they are co-existent is because they come together to produce a common effect, they function together as causes at the time of the arising of a dharma.
Conjoined cause (saṃprayuktaka-hetu). This refers to co-existent causes in the mental domain of citta-caittas. According to Saṃghabhadra: "This [conjoined] cause is established because thought and thought concomitants, being conjoined, accomplish the same deed by grasping the same object."
The Four Conditions
Saṃghabhadra argues that even though the arising of dharmas depends on numerous conditions, the Buddha taught only four conditions in the sutras. Against the Sautrāntikas, who held that these were mere conceptual designations, Vaibhāṣikas assert that they are real existents.
The four conditions are first found in Devaśarman’s Vijñānakāya (ca. 1st century CE) and they are:
Condition qua cause (hetu-pratyaya). According to Dhammajoti, "This is the condition in its capacity as direct cause in the production of an effect — it is the cause functioning as the condition." This condition subsumes all causes, except the efficient cause.
Equal-immediate condition (samanantara-pratyaya). This refers to a mental process (a citta or caitta) that is a condition for the arising of the next mental process. Dhammajoti: "It both gives way to and induces the arising of the next citta-caitta in the series." For Vaibhāṣikas, this does not apply to matter, but Sautrāntikas argued that it does.
Condition qua object (ālambana-pratyaya). This refers to the fact that cognition cannot arise without an object, and thus "in this sense, the object serves as a condition for the cognition." Since the mind can take any object, "the condition qua object is none other than the totality of dharma‑s (Saṃghabhadra)."
Condition of dominance (adhipati-pratyaya). Dhammajoti defines it thus: "This is the most comprehensive or generic condition, corresponding to efficient cause: It is whatever serves as a condition, either in the sense of directly contributing to the arising of a dharma, or indirectly through not hindering its arising. From the latter perspective, the unconditioned dharma‑s — although transcending space and time altogether — are also said to serve as conditions of dominance."
Five Fruits
The Sarvāstivāda also taught that there are five fruits, i.e. causal effects:
Disconnection fruit (visaṃyogaphala). This refers to disconnection from the defilements, and is acquired through the practice of the noble path which leads to the acquisition of the dharma "cessation through deliberation" (pratisaṃkhyā-nirodha).
Virile fruit (puruṣakāra-phala). This is related to the co-existent cause and the conjoined cause. According to Vasubandhu it is "That which is the activity or efficacy (kāritra) of a dharma; [so called] because it is like a virile action."
Fruit of dominance (adhipati-phala). This is the most generic fruit; it is produced by the efficient cause. According to Dhammajoti, "the fruits commonly shared by a collection of beings by virtue of their collective karma‑s belong to this category. Thus, the whole universe with all its planets, mountains and oceans, etc., is the result — the fruit of dominance — of the collective karma‑s of the totality of beings inhabiting therein."
Uniform-emanation fruit (niṣyanda-phala). This is a fruit issued from a cause of a similar nature; it is correlated with the homogeneous cause and the universal cause.
Retribution fruit (vipāka-phala). This fruit only deals with individual sentient beings (sattvākhya), and is correlated with the retribution cause.
Epistemology
The Vaibhāṣika epistemology defended a form of realism that is established through experience. Their theory of knowledge held that one could know dharmas as unique forces with unique characteristics by two means of knowledge (pramāṇa): direct perception (which includes spiritual vision) or inference (anumāna), which relies on direct experience.
For Vaibhāṣikas like Saṃghabhadra “the characteristic of an existent (sal-lakṣaṇa) is that it can serve as an object producing cognition (buddhi)”. Because of this, an object of knowledge is necessarily existent, though it can be either a true existent (dravyata) or a conceptual existent (prajñapti). As Dhammajoti notes, "the possibility of knowing an object necessarily implies the true ontological status of the object."
This view was rejected by Sautrāntikas like Śrīlāta, who argued that a cognitive object could be unreal, pointing to examples such as optical illusions, dreams, the false cognition of a self or really existent person (pudgala), and so on. The Vaibhāṣika response is that even in the case of such mistaken cognitive constructs, there is a real basis which acts as part of the causal process. As explained by Dhammajoti: "An absolute non-existent (atyantam asad) has no function whatsoever and hence can never engender a consciousness. Thus, in the case of the perception of the unreal pudgala, the perceptual object is not the pudgala which is superimposed, but the five skandha‑s which are real existents." Furthermore, as noted by Dhammajoti, "sensory perception as a pratyakṣa experience is fully accomplished only in the second moment on recollection." This is because the external object must first be experienced by "direct perception supported by a sense faculty" (indriyāśrita-pratyakṣa) before a discerning perception (buddhi-pratyakṣa) can arise, since the discerning perception uses the previous sense faculty perception as a cognitive support (ālambana).
Vaibhāṣika defended the real existence of external objects by arguing that mental defilements arise in different ways because of the causal force of the mind's intentional object. Likewise, sensory perception (pratyakṣa) is said to arise due to various causes and conditions, one of which is a real external object. According to Dhammajoti, for Vaibhāṣikas like Saṃghabhadra, "a sensory consciousness necessarily takes a physical assemblage or agglomeration of atoms (he ji 和集; *saṃcaya, *saṃghāta, *samasta). What is directly perceived is just these atoms assembled together in a certain manner, not a conceptualized object such as a jug, etc."
For Vaibhāṣikas, knowledge (jñāna) is a caitta (mental factor) that has the distinguishing characteristic of being "understanding that is decisive or definite (niścita)". There are various kinds of knowledge: for example, dharma-knowledge (dharma-jñāna) is the knowledge that realizes the true nature of dharmas; conventional knowledge (saṃvṛti-jñāna) deals with conventional (not ultimate) things; and knowledge of non-arising (anutpāda-jñāna) refers to the knowledge one has when one knows nirvana has been achieved.
Defilement (kleśa)
The goal of Buddhism is often seen as the freedom from suffering which arises from the complete removal of all defilements (kleśa). This is a state of perfection that is known by an arhat or Buddha through the "knowledge of the destruction of the outflows" (āsravakṣaya-jñāna). Ābhidharmikas saw the Abhidharma itself, which in the highest sense is just wisdom (prajñā), as the only means to end the defilements.
Kleśa is commonly defined as that which "soils" or defiles as well as that which disturbs and afflicts a psycho-physical series. Another important synonym for defilement is anuśaya, which is explained by Vaibhāṣikas as a subtle or fine (aṇu) dharma that adheres and grows with an object, "like the adherence of dust on a wet garment or the growth of seeds in an irrigated field". This is in contrast to other interpretations of anuśaya, such as that of the Sautrāntikas, who saw them as "seeds" (bīja) of kleśas. Thus, for Vaibhāṣikas there is no such thing as a latent defilement.
The defilements are seen as the root of existence (mūlaṃ bhavasya), since they produce karma, which in turn leads to further rebirths. The most fundamental defilements are known as the three unskillful roots (akuśala-mūla), referring to greed (rāga), hostility (pratigha) and ignorance (avidyā). Of these, ignorance is the most fundamental. It is defined by Saṃghabhadra as "a distinct dharma which harms the capability of understanding (prajñā). It is the cause of topsy-turvy views and obstructs the examination of merits and faults. With regard to dharma-s to be known it operates in the mode of disinclination, veiling the thought and thought-concomitants."
According to Dhammajoti, other major terms used to describe defilements are: 1. fetter (saṃyojana); 2. bondage (bandhana); 3. envelopment (paryavasthāna); 4. outflow (āsrava); 5. flood (ogha); 6. yoke (yoga); 7. clinging (upādāna); 8. corporeal tie (kāya-grantha); 9. hindrance (nivaraṇa). These numerous categories are used to describe various doctrinal topics and create a taxonomy of dharmas. For example, all dharmas are either with or without outflows (āsrava), which are dharmas that keep sentient beings flowing on through existence and also cause impurities to flow through the sense fields.
These are also further divided into sub-categories. For example, there are three āsrava types: sensuality-outflow (kāmāsrava), existence-outflow (bhavāsrava) and ignorance-outflow (avidyāsrava); there are four clingings: sensuality-clinging (kāmopādāna), view-clinging (dṛṣṭy-upādāna), clinging to abstentions and vows (śīlavratopādāna), and Soul-theory-clinging (ātmavādopādāna); and there are five hindrances: (i) sensual-desire, (ii) malice, (iii) torpor-drowsiness (styāna-middha), (iv) restlessness-remorse (auddhatyakaukṛtya), and (v) doubt.
For Vaibhāṣikas, the elimination of the defilements thus begins with an investigation into the nature of dharmas (dharma-pravicaya). This examination is carried out in various ways, such as investigating how defilements arise and grow, what their cognitive objects are, and whether a defilement is to be abandoned by insight into the four noble truths (darśanapraheya) or by cultivation (bhāvanāpraheya).
In the Vaibhāṣika system, the abandonment of a defilement is not the complete destruction of it, since all dharmas exist throughout the three times. Instead, one becomes defiled when the dharma of acquisition links one with the defilement (saṃyoga), and one abandons the defilement when there is both the ceasing of the dharma of acquisition as well as the arising of the acquisition of disconnection (visaṃyoga-prāpti). While the abandonment of a dharma happens at once and is not repeated, the acquisition of disconnection can take place over and over again, reflecting deeper and firmer spiritual progress.
This is important because as Dhammajoti notes, Vaibhāṣikas affirm that "freedom from duḥkha must be gained by gradually and systematically abandoning the defilements" and reject the view that awakening happens abruptly. There are four methods of abandoning a defilement, the first three deal with abandonment by insight (darśana-heya):
ālambana-parijñāna: Complete understanding of the nature of the object due to which the defilement arises.
tadālambana-saṃkṣaya: The destruction of a defilement which is the object of another defilement along with the destruction of the latter (the subject).
ālambana-prahāṇa: The abandonment of a defilement that takes as object another defilement by abandoning the latter — the object.
pratipakṣodaya: The abandonment of a defilement on account of the arising of its counteragent. This is specifically applied to the defilements that are abandoned by cultivation (bhāvanā-heya).
Karma
While the Vaibhāṣikas acknowledge the profound and ultimately inconceivable nature of karma, they still attempted to give a rational account of its basic workings and to show how it was a middle way between determinism and absolute freedom. The Mahāvibhāṣa (MVŚ) notes that there are different but related ways in which the term karma is used. It can refer to actions in a general sense and it can refer specifically to ethical actions which have desirable or undesirable effects.
Karma is also used to refer to the actual retribution causes (vipāka‑hetu) of actions, which according to Dhammajoti, play a crucial role "in determining the various spheres (dhātu), planes (gati) and modes of birth (yoni) of a sentient being’s existence and in differentiating the various types of persons (pudgala) with their various life-span, physical appearances, social status, etc."
It is also important to note that karma is not the only contributing factor to rebirth; as Vasubandhu states: "It is not karma alone which is the projector of a birth (janman)." Karma is also related to the defilements, since the defilements act as the generating cause and supporting condition for karma.
Classifications
There are three main types of karma: bodily, vocal and mental. Out of all the different elements of karma, it is the volitional aspect (abhisam-√kṛ, cetanā), which comprises all mental karma, that is the most central and fundamental, since it originates and assists the other types of karma. Saṃghabhadra, citing the sutras, states that volition (i.e. mental karma) is karma "in the proper or specific sense inasmuch as it is the prominent cause (*viśiṣṭa-hetu) in projecting a sentient existence."
The Vaibhāṣikas also had further classifications of the different types of karma. For example, there are:
Volitional karma (cetanā) and karma subsequent to willing (cetayitvā);
Informative (vijñapti) and non‑informative (avijñapti) karma. This refers to bodily and vocal actions which inform others of the corresponding mental state.
Skillful (kuśala), unskillful (akuśala) and morally neutral (avyākṛta) karmas.
Karmas which are with-outflow (sāsrava) and outflow-free (anāsrava) karmas.
Determinate (niyata) and indeterminate (aniyata) karma.
Karma that is done (kṛta) and karma that is accumulated (upacita).
Projecting (ākṣepaka) and completing (paripūraka) karmas.
The informative and non-informative category is particularly important. For the Vaibhāṣika, both types are real entities and are included as cetayitvā karma. The nature of informative karma is material: it is the specific bodily shape at the time of the accomplishment of an action (which includes sound). Saṃghabhadra defends this by arguing that if all karma were mere volition (as held by the Sautrāntikas), then as soon as one had the intention to kill, this would be the same as committing the deed. Vaibhāṣikas also held that non-informative karma was a kind of subtle "non-resistant" matter which preserved karmic efficacy, a view that was vigorously attacked by the Sautrāntikas.
Like other Buddhist schools, the Vaibhāṣikas taught the ten paths of karma as a major ethical guide to what should be avoided and what should be cultivated. It should be emphasized that volition remains the core of this teaching, that is, even if one avoids acting on one's harmful intentions, the intention itself remains an unskillful karma.
Karma through time
The Vaibhāṣika theory of karma is also closely related to their theory of tri-temporal existence, since karmas also exist in the past and in the future. Indeed, the efficacy of past karma is part of their argument for "all exists", since, for the Vaibhāṣika, if a past karmic retributive cause ceases to exist completely, it cannot lead to the karmic effect or fruit. As Dhammajoti explains: "At the very moment when a retributive cause arises, it determines the causal connection with the fruit-to-be; i.e., ‘it grasps the fruit’. At a subsequent time, when the necessary conditions obtain, it, although past, can causally actualize the fruit by dragging it, as it were, out of the future into the present; i.e., ‘it gives the fruit’." This was of course rejected by the Sautrāntikas, who posited a competing theory, known as the theory of seeds, which held that a volition creates a chain of momentary dharmas called seeds, which are continuously transmitted in the mind stream until they sprout, producing the karmic effect.
Saṃghabhadra critiques this theory by pointing out that when a seed turns into a plant, there is no interruption in the process. But in the Sautrāntika view, there can be an interruption, as when a person has thoughts of a different ethical type or when they enter into meditations that completely interrupt mental activity (such as asaṃjñi-samāpatti or nirodha-samāpatti). And since Sautrāntikas are presentists, the past karma has also ceased to exist at this point and thus cannot be a cause for its fruit.
Karmic retribution
In Vaibhāṣika Abhidharma, the nature of karmic retribution, i.e. how a person experiences the results of their actions, is not fixed and depends on different conditions, such as the spiritual status and wisdom of the person. There are six factors that affect the gravity of karmic retribution (and subsequently, how bad one's future rebirth is):
The actions performed after the major karmic act.
The status of the ‘field’ (kṣetra-viśeṣa), referring to the ethical and spiritual status of the person.
The basis (adhiṣṭhāna), which is the act itself.
The preparatory action (prayoga) leading up to the main act.
Volition (cetanā), the intentional mental force behind the act.
The strength of the intention (āśaya-viśeṣa).
There are also said to be some karmas that may or may not lead to retribution; these are indeterminate (aniyata) karmas, which are contrasted with determinate karmas, i.e. those that necessarily cause retribution (whether in this life, in the next or in some further life). Indeterminate karmas can be rendered weak or fruitless through the practice of the spiritual path; the "Salt-simile sutra" (Loṇa-phala-sutta) is cited in support of this. Determinate karmas, such as particularly dark acts like killing one's parents, cannot be so transformed.
Another important distinction here is that between karma that is done (kṛta), which refers to preparatory and principal actions, and karma that is accumulated (upacita), which refers to the consecutive actions which "complete" the action. For example, one may prepare to kill someone and attempt to do so, but fail; in this case, the action is not accumulated. Likewise, an action not done intentionally is not accumulated. Though the preparation is still a bad karma, it is not necessarily retributive. If, however, something is willed and accomplished, it is necessarily retributive.
Yet another key distinction is that between projecting (ākṣepaka) and completing (paripūraka) karmas. A projecting karma is a single act which is the principal cause that projects one's future existence (as well as for the intermediate existence, the antarā-bhava), while completing karmas are responsible for specific experiences within that one existence, such as lifespan.
Finally, it is important to note that in this system, karma is primarily individual. That is to say, one person's karma will not cause a retribution fruit to be experienced by another person.
However, there is a karmic fruit which is experienced by a collective of individuals, which is the fruit of dominance (adhipati-phala), which affects the vitality and durability of external things, such as plants and planets. This is used to explain how, when persons do good actions, the external world is affected by the "four increases": "of lifespan, of sentient beings, of external items of utility and enjoyment (pariṣkāra), and of skillful dharma‑s." In this sense then, there is "collective karma." Thus, for the Vaibhāṣikas, the whole universe is the collective karma (i.e. the fruit of dominance) of all beings living in it.
Dependent Origination
The Sarvāstivāda Abhidharma interpretation of the key Buddhist doctrine of Dependent Origination (pratītya-samutpāda) focuses on how the 12 links (nidāna) contribute to rebirth from the perspective of three periods of existence (past, present, future). This is explained in the following way:
Past causes
1. ignorance (avidyā), represents all the defilements in one's past life, since all defilements are conjoined with and caused by ignorance.
2. conditionings (saṃskāra), this refers to all past karmic constructions driven by ignorance.
Present effects
3. consciousness (vijñāna), this specifically refers to the consciousness that enters the womb at the moment of rebirth.
4. psycho-physical complex (nāma-rūpa), represents the body and mind, particularly as it develops in the womb.
5. six sense fields (ṣaḍāyatana), refers to the five senses and the mental sense.
6. contact (sparśa), refers to contact between the sense faculties and their objects.
7. sensation (vedanā), refers to different pleasant, unpleasant and neutral sensations.
Present causes
8. craving (tṛṣṇā), craving for sensuality, desire for material things and sex.
9. grasping (upādāna), strong clinging for the objects of craving.
10. existence (bhava), refers to all present karmas that project a future existence.
Future effects
11. birth (jāti), represents the first re-linking consciousness in a future birth.
12. old-age-and-death (jarā-maraṇa), represents everything that happens from future rebirth until death.
Though presented in a linear way in the form of a list, these factors are said to be mutually conditioning among each other in various interconnected ways.
Though the three-life model, also called "prolonged" (prākarṣika), is the most widely used way of understanding dependent origination, Sarvāstivāda Ābhidharmikas also accepted three other ways of explaining it:
Momentary (kṣaṇika): the 12 links are explained as being present within a single mind moment.
Pertaining to states (āvasthika): This model states that the five aggregates are present in each of the 12 links. Each link is so named because it is the predominant force among the aggregates at that moment, and thus the entire collection of aggregates is given the name ignorance (and so on) at that point in time.
Connected (sāṃbandhika): Refers to how the 12 links are conjoined with the entire field of causes and effects, i.e. "all conditioned dharmas" or the whole of phenomenal existence.
Spiritual path
The study of the nature and function of spiritual paths is important to Abhidharma. For the Vaibhāṣikas, the spiritual path is a gradual process of abandoning the defilements; there is no "sudden enlightenment". The analysis of the various spiritual paths provided by the Vaibhāṣika Abhidharma corresponds to the abandoning of various defilements.
The beginning of the path consists of preliminary practices: approaching "true persons", listening to the Dharma, contemplating the meaning and practicing the Dharma and what accords with the Dharma. Preparatory practices also include the observance of the ethical precepts (śīlaṃ pālayati), giving, and studying the Abhidharma.
The Mahāvibhāṣa (MVŚ) contains the following succinct explanation of the stages leading up to stream entry: "At the beginning, because of his aspiration for the fruit of liberation, he diligently practices [i] giving (dāna) and the pure precepts (śīla); [ii] the understanding derived from listening, the contemplation of the impure, mindfulness of breathing and the foundations of mindfulness (smṛtyupasthāna); and [iii] warmth, summits, receptivities and the supreme mundane dharma‑s; and [then he enters into] [iv] the 15 moments of the path of vision. This is collectively said to be “firmly on one’s feet”."
Stages of the path
Vaibhāṣika developed an influential outline of the path to awakening, one which was later adapted and modified by the scholars of the Mahayana tradition into the schema of the "five paths" (pañcamārga). The original Vaibhāṣika schema is divided into seven stages of preparatory effort (prayoga) and four stages of spiritual fruits (phala):
The Seven prayogas:
Mokṣabhāgīya ("conducing to liberation") refers to meditations which are causes for liberation, mainly calm and insight. These are not completely separate and can exist together in the same thought. They are also said to constitute the wisdom (prajñā) derived from cultivation. These are outlined as follows:
Śamatha (calming meditation) practices, mainly contemplation on the impure (aśubha-bhāvanā) and mindfulness of breathing (ānāpānasmṛti), but also includes other meditations such as loving kindness (maitrī).
Vipaśyana (insight meditation), consisting of the fourfold application of mindfulness (smṛtyupasthānas) practiced one at a time, contemplating how they are impure, unsatisfactory, impermanent and without a Self.
In a more advanced stage of vipaśyana, one meditates on the four smṛtyupasthānas at the same time.
Nirvedhabhāgīya ("conducing to penetration") refers to the "four skillful roots" (kuśalamūla) of the arising of outflow-free knowledge. It thus refers to that which leads to stream entry, the first noble (ārya) stage of liberation. They are said to be the wisdom derived from reflection. Each one serves as a cause for the next:
Uṣmagata (warmth) is the initial arising of "the warmth of the noble knowledge capable of burning the fuels of defilements" (MVŚ). This is a lengthy stage, where one gradually accumulates wisdom through study, contemplation and meditation on the Dharma, especially the 16 aspects of the four noble truths. At this point, one may still retrogress.
Mūrdhan (summits). One continues to contemplate the 16 modes of the four noble truths, but at the highest level of excellence, their "summit" or "peak". At this point, one may still retrogress.
Kṣānti (receptivities) is the stage of the highest level of receptivity or acceptance of the four noble truths. One is so receptive to them that one can no longer retrogress from accepting them. There are various receptivities covering the sense sphere as well as the upper spheres of existence.
Laukikāgradharma (supreme mundane dharmas). These are the dharmas which, contemplating the unsatisfactoriness of the sphere of sensuality, serve as the condition for the arising of the darśana-mārga (path of vision).
The four phala:
Each has two stages, the candidacy stage and the fruit stage.
Srotaāpatti (stream-enterer).
The candidate for the fruit of stream-entry (srotaāpatti-phalapratipannaka), also known as the darśana-mārga (path of vision).
The “abider in the fruit of stream entry” (srotaāpatti-phala-stha). At this point, one has entered the bhāvanā-mārga (path of cultivation), where one gradually eliminates all the remaining defilements.
Sakṛdāgāmin (once returner), both stages fall within the bhāvanā-mārga.
Anāgāmin (non-returner), both stages also fall within the bhāvanā-mārga.
Arhat. Its candidacy stage is part of the bhāvanā-mārga, but the phala stage is known as aśaikṣa-mārga (path of no more learning).
In the prayoga stages, the contemplation of the four noble truths was done with knowledge that is with-outflow (sāsrava). Immediately after the last prayoga stage, one is able to access outflow-free knowledges (anāsrava-jñāna) and must apply these to the noble truths. This is known as direct realization (abhisamaya), direct spiritual insight into the intrinsic and common characteristics of the four truths. It takes 16 thought-moments, and insight into each truth is achieved in two moments called "paths". Dhammajoti explains them as follows:
"In the first moment, called the unhindered path (ānantarya-mārga), the outflow-free understanding that arises is called a receptivity (kṣānti) to knowledge, and with this, the defilements abandonable by vision into the particular truth are abandoned. In the following moment, called the path of liberation (vimukti-mārga), knowledge proper arises through the induction of which the acquisition (prāpti) of the cessation through deliberation (pratisaṃkhyā-nirodha) of the defilements arises. In this way, for the whole contemplative process covering the sphere of sensuality followed by the two upper spheres, there arise eight receptivities and eight knowledges, all being prajñā in their intrinsic nature."
From the first moment of insight, which is the first moment of receptivity, one is said to be an ārya, a noble being. This is because the outflow-free path has arisen in them and thus they have left the state of the ordinary worldling (pṛthagjanatva). Also, according to this system, once one has entered stream entry, there is no going back, no retrogression. Regarding arhatship, some arhats can retrogress, mainly those who, due to their weak faculties, entered the path as a "pursuer through faith" (śraddhānusārin). Those who have sharp faculties and have studied and understood the teachings (dharmānusārins) are not retrogressible; they are ‘ones liberated through wisdom’ (prajñā-vimukta).
The three vehicles and noble beings
The Vaibhāṣika Sarvāstivādins are known to have employed the schema of the Three Vehicles, which can be seen in the Mahāvibhāṣā:
Śrāvakayāna – The vehicle of the disciples, who reach the attainment of arhatship.
Pratyekabuddhayāna – The vehicle of the "Solitary Buddhas".
Bodhisattvayāna – The vehicle of the beings who are training to become a fully enlightened buddha (samyaksambuddha).
The Vaibhāṣikas held that though arhats have been fully liberated through the removal of all defilements, their wisdom (prajñā) is not fully perfected and thus inferior to a Buddha's wisdom. Also, arhats have subtle traces (vāsanā) that the defilements have left behind after they have been abandoned. Thus, for Vaibhāṣikas, arhats are said to have a certain non-defiled ignorance (akliṣṭājñāna), which Buddhas lack. Furthermore, a Buddha has both omniscience (sarvajñā) and ‘wisdom of all modes’ (sarva‑ākāra‑jñāna), i.e. a knowledge of all the spiritual paths.
The inferiority of the arhat attainment can be seen in texts such as the Sarvāstivādin Nāgadatta Sūtra, which critiques the Mahīśāsaka view of women in a narrative about a bhikṣuṇī named Nāgadatta. Here, the demon Māra takes the form of her father, and tries to convince her to work toward the lower stage of an arhat. Nāgadatta rejects this, saying, "A Buddha's wisdom is like empty space of the ten-quarters, which can enlighten innumerable people. But an Arhat's wisdom is inferior."
However, against the docetic view of the Mahāsāṃghikas, the Sarvāstivādins viewed the Buddha's physical body (Skt. rūpakāya) as being impure and improper for taking refuge in, and they instead regarded taking refuge in the Buddha as taking refuge in bodhi itself (awakening) and also in the Dharmakāya (body of the teaching).
The Sarvāstivādins also admitted the path of a bodhisattva as a valid one. References to the bodhisattva path and the practice of the six pāramitās are commonly found in Sarvāstivāda works. The Mahāvibhāṣā of the Vaibhāṣika Sarvāstivādins includes a schema of four pāramitās: generosity (dāna), discipline (śīla), energy (vīrya), and wisdom (prajñā), and it says that the four pāramitās and six pāramitās are essentially equivalent (seeing patience as a kind of discipline and meditation as a kind of intuitive wisdom).
References
Notes
Sources
Dhammajoti, Bhikkhu K.L. (2009). Sarvāstivāda Abhidharma. Centre of Buddhist Studies, The University of Hong Kong.
Westerhoff, Jan (2018). The Golden Age of Indian Buddhist Philosophy in the First Millennium CE, pp. 60–61.
Willemen, Charles; Dessein, Bart; Cox, Collett (1998). Sarvāstivāda Buddhist Scholasticism. Handbuch der Orientalistik, Zweite Abteilung: Indien.
Early Buddhist schools
Nikaya schools
Buddhism in India
Abhidharma
Sarvāstivāda
Zonti is a village in Istria, Croatia.
Demographics
According to the 2021 census, its population was 40.
References
Populated places in Istria County
Modibo Keita International Airport (formerly Bamako–Sénou International Airport) is Mali's main airport located approximately south of downtown Bamako, the capital of Mali in West Africa. It is the country's only international airport. It is managed by Aéroports du Mali (ADM). Its operations are overseen by the Malian Ministry of Equipment and Transport.
History
Bamako-Sénou Airport was opened to traffic in 1974. The airport was upgraded between 2007 and 2012 in a US$181 million project funded by the Millennium Challenge Corporation, a United States foreign aid agency.
Military base
Bamako–Sénou International Airport is adjacent to Air Base 101, which is used by the Mali Air Force.
Statistics
Passenger traffic steadily increased in the early 2000s. Government figures show 403,380 passengers in 1999, 423,506 in 2003, 486,526 in 2004, and 516,000 in 2005. In 2006 it was predicted to reach over 900,000 by 2015 under a low (4%) yearly growth rate scenario.
Total air traffic at BKO increased by 12.4% in 2007 and 14% in 2008. Most of this increase came in passenger transport, with the number of passengers served increasing by 20% in 2007 and 17% in 2008. Twenty-seven airline carriers operated weekly or better at BKO in the 2007–2008 period. This continued growth was partly offset by a decline in cargo flights of 16.75% in 2007 and 3.93% in 2008.
Airlines and destinations
Passenger
Cargo
Accidents and incidents
On 24 July 1971, a Douglas C-47A (6V-AAP) of Air Ivoire crashed into a hill 67 seconds after take-off from runway 24 at night. The aircraft was operating a scheduled passenger flight. All six occupants were killed.
On 31 May 1981, a Dassault Falcon 20C (7T-VRE) of the Algerian government crashed 8 km (5 miles) from the airport on approach, killing 3 of the 6 occupants. The plane was on an official state flight, carrying foreign minister Mohamed Seddik Benyahia. Benyahia survived, but was killed the following year in a shootdown.
On 30 June 1996, a Boeing 707-369C (5X-JON) of Air Afrique leased from DAS Air Cargo became unstable shortly after landing due to a sudden burst of rain and veered off the runway, striking a bunker and detaching the right wing. All 4 occupants survived; the plane was written off.
In October 2007 (day unknown), an Ilyushin Il-76TD (5A-DNQ) of Jamahiriya Air Transport sustained serious damage on landing when the nose gear collapsed. The plane was later repaired.
On 14 June 2017, a Beechcraft 200 Super King Air (TZ-DDG) of Malian Aero Company suffered a landing accident after returning from a cloud-seeding operation over Mopti at 14:05. The plane came to rest on the right side of the runway with substantial propeller damage and was subsequently written off. The sole occupant survived.
References
External links
A–Z World Airports: Bamako – Senou Int'l Airport (BKO/GABS)
Aeronautical charts for BKO/GABS from ASECNA
Avient Aviation Scheduled Flights
Airports established in 1974
Airports in Mali
Buildings and structures in Bamako
1974 establishments in Mali
```python
#!/usr/bin/env python3
# Distributed under the MIT software license, see the accompanying
# file COPYING or path_to_url
"""
ZMQ example using python3's asyncio
VERGE should be started with the command line arguments:
verged -testnet -daemon \
-zmqpubrawtx=tcp://127.0.0.1:220103 \
-zmqpubrawblock=tcp://127.0.0.1:220103 \
-zmqpubhashtx=tcp://127.0.0.1:220103 \
-zmqpubhashblock=tcp://127.0.0.1:220103
We use the asyncio library here. `self.handle()` installs itself as a
future at the end of the function. Since it never returns with the event
loop having an empty stack of futures, this creates an infinite loop. An
alternative is to wrap the contents of `handle` inside `while True`.
"""
import binascii
import asyncio
import zmq
import zmq.asyncio
import signal
import struct
import sys
if not (sys.version_info.major >= 3 and sys.version_info.minor >= 5):
    print("This example only works with Python 3.5 and greater")
    sys.exit(1)

port = 220103

class ZMQHandler():
    def __init__(self):
        self.loop = asyncio.get_event_loop()
        self.zmqContext = zmq.asyncio.Context()
        self.zmqSubSocket = self.zmqContext.socket(zmq.SUB)
        self.zmqSubSocket.setsockopt_string(zmq.SUBSCRIBE, "hashblock")
        self.zmqSubSocket.setsockopt_string(zmq.SUBSCRIBE, "hashtx")
        self.zmqSubSocket.setsockopt_string(zmq.SUBSCRIBE, "rawblock")
        self.zmqSubSocket.setsockopt_string(zmq.SUBSCRIBE, "rawtx")
        self.zmqSubSocket.connect("tcp://127.0.0.1:%i" % port)

    async def handle(self):
        msg = await self.zmqSubSocket.recv_multipart()
        topic = msg[0]
        body = msg[1]
        sequence = "Unknown"
        if len(msg[-1]) == 4:
            msgSequence = struct.unpack('<I', msg[-1])[-1]
            sequence = str(msgSequence)
        if topic == b"hashblock":
            print('- HASH BLOCK ('+sequence+') -')
            print(binascii.hexlify(body))
        elif topic == b"hashtx":
            print('- HASH TX ('+sequence+') -')
            print(binascii.hexlify(body))
        elif topic == b"rawblock":
            print('- RAW BLOCK HEADER ('+sequence+') -')
            print(binascii.hexlify(body[:80]))
        elif topic == b"rawtx":
            print('- RAW TX ('+sequence+') -')
            print(binascii.hexlify(body))
        # schedule ourselves to receive the next message
        asyncio.ensure_future(self.handle())

    def start(self):
        self.loop.add_signal_handler(signal.SIGINT, self.stop)
        self.loop.create_task(self.handle())
        self.loop.run_forever()

    def stop(self):
        self.loop.stop()
        self.zmqContext.destroy()

daemon = ZMQHandler()
daemon.start()
``` |
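The docstring above notes that an alternative to the self-scheduling `asyncio.ensure_future(self.handle())` call is to wrap the body of `handle` in a `while True` loop. That variant can be illustrated without a running daemon; in this sketch an `asyncio.Queue` stands in for the ZMQ socket (so the example is runnable anywhere), and only the little-endian sequence-number parsing from the example above is reused:

```python
import asyncio
import struct

async def handle(queue: asyncio.Queue, out: list) -> None:
    # while-True variant of ZMQHandler.handle(): the loop itself keeps
    # receiving, so no re-scheduling of a new future is needed.
    while True:
        msg = await queue.get()
        if msg is None:  # sentinel so this sketch terminates
            break
        topic, body, seq = msg
        sequence = "Unknown"
        if len(seq) == 4:
            sequence = str(struct.unpack('<I', seq)[0])
        out.append((topic, sequence))

async def main() -> list:
    queue: asyncio.Queue = asyncio.Queue()
    out: list = []
    # simulate two multipart messages shaped [topic, body, LE-uint32 sequence]
    await queue.put((b"hashtx", b"\x00" * 32, struct.pack('<I', 7)))
    await queue.put((b"hashblock", b"\x11" * 32, struct.pack('<I', 8)))
    await queue.put(None)
    await handle(queue, out)
    return out

print(asyncio.run(main()))
# expected: [(b'hashtx', '7'), (b'hashblock', '8')]
```

With a real socket there is no sentinel: the loop simply awaits `recv_multipart()` forever, and `loop.stop()` from the signal handler ends it.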
```go
//
//
// path_to_url
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
package bcc
import (
"fmt"
"regexp"
"sync"
"unsafe"
)
/*
#cgo CFLAGS: -I/usr/include/bcc/compat
#cgo LDFLAGS: -lbcc
#include <stdlib.h> // for free(), used below via C.free
#include <bcc/bcc_common.h>
#include <bcc/libbpf.h>
#include <bcc/bcc_syms.h>
extern void foreach_symbol_callback(char*, uint64_t);
*/
import "C"
type symbolAddress struct {
name string
addr uint64
}
//symbolCache will cache module lookups
var symbolCache = struct {
cache map[string][]*symbolAddress
currentModule string
lock *sync.Mutex
}{
cache: map[string][]*symbolAddress{},
currentModule: "",
lock: &sync.Mutex{},
}
type bccSymbol struct {
name *C.char
demangleName *C.char
module *C.char
offset C.ulonglong
}
type bccSymbolOption struct {
useDebugFile int
checkDebugFileCrc int
useSymbolType uint32
}
// resolveSymbolPath returns the file and offset to locate symname in module
func resolveSymbolPath(module string, symname string, addr uint64, pid int) (string, uint64, error) {
if pid == -1 {
pid = 0
}
modname, offset, err := bccResolveSymname(module, symname, addr, pid)
if err != nil {
return "", 0, fmt.Errorf("unable to locate symbol %s in module %s: %v", symname, module, err)
}
return modname, offset, nil
}
func bccResolveSymname(module string, symname string, addr uint64, pid int) (string, uint64, error) {
symbol := &bccSymbol{}
symbolC := (*C.struct_bcc_symbol)(unsafe.Pointer(symbol))
moduleCS := C.CString(module)
defer C.free(unsafe.Pointer(moduleCS))
symnameCS := C.CString(symname)
defer C.free(unsafe.Pointer(symnameCS))
res, err := C.bcc_resolve_symname(moduleCS, symnameCS, (C.uint64_t)(addr), C.int(pid), nil, symbolC)
if res < 0 {
return "", 0, fmt.Errorf("unable to locate symbol %s in module %s: %v", symname, module, err)
}
return C.GoString(symbolC.module), (uint64)(symbolC.offset), nil
}
func bccResolveName(module, symname string, pid int) (uint64, error) {
symbol := &bccSymbolOption{}
symbolC := (*C.struct_bcc_symbol_option)(unsafe.Pointer(symbol))
pidC := C.int(pid)
cache := C.bcc_symcache_new(pidC, symbolC)
defer C.bcc_free_symcache(cache, pidC)
moduleCS := C.CString(module)
defer C.free(unsafe.Pointer(moduleCS))
nameCS := C.CString(symname)
defer C.free(unsafe.Pointer(nameCS))
var addrC C.uint64_t
res := C.bcc_symcache_resolve_name(cache, moduleCS, nameCS, &addrC)
if res < 0 {
return 0, fmt.Errorf("unable to locate symbol %s in module %s", symname, module)
}
// return the value the C call wrote into addrC; the original code returned
// a separate Go variable that was never updated and was therefore always 0
return (uint64)(addrC), nil
}
// getUserSymbolsAndAddresses finds a list of symbols associated with a module,
// along with their addresses. The results are cached in the symbolCache and
// returned
func getUserSymbolsAndAddresses(module string) ([]*symbolAddress, error) {
symbolCache.lock.Lock()
defer symbolCache.lock.Unlock()
// return previously cached list if it exists
if _, ok := symbolCache.cache[module]; ok {
return symbolCache.cache[module], nil
}
symbolCache.cache[module] = []*symbolAddress{}
symbolCache.currentModule = module
if err := bccForeachSymbol(module); err != nil {
return nil, err
}
return symbolCache.cache[module], nil
}
func matchUserSymbols(module, match string) ([]*symbolAddress, error) {
r, err := regexp.Compile(match)
if err != nil {
return nil, fmt.Errorf("invalid regex %s : %s", match, err)
}
matchedSymbols := []*symbolAddress{}
symbols, err := getUserSymbolsAndAddresses(module)
if err != nil {
return nil, err
}
for _, sym := range symbols {
if r.MatchString(sym.name) {
matchedSymbols = append(matchedSymbols, sym)
}
}
return matchedSymbols, nil
}
// foreach_symbol_callback is a gateway function that will be exported to C
// so that it can be referenced as a function pointer
//export foreach_symbol_callback
func foreach_symbol_callback(symname *C.char, addr C.uint64_t) {
symbolCache.cache[symbolCache.currentModule] =
append(symbolCache.cache[symbolCache.currentModule], &symbolAddress{C.GoString(symname), (uint64)(addr)})
}
func bccForeachSymbol(module string) error {
moduleCS := C.CString(module)
defer C.free(unsafe.Pointer(moduleCS))
res := C.bcc_foreach_function_symbol(moduleCS, (C.SYM_CB)(unsafe.Pointer(C.foreach_symbol_callback)))
if res < 0 {
return fmt.Errorf("unable to list symbols for %s", module)
}
return nil
}
``` |
Her Lucky Night is a 1945 musical film starring The Andrews Sisters. It was their last film for Universal.
Plot
A story of a woman who tries to find a boyfriend.
Cast
Lawsuit
The film was part of a lawsuit brought by Harold Lloyd against Universal Pictures. He claimed they had copied sequences from his films The Freshman, Movie Crazy and Welcome Danger in their films Her Lucky Night, So's Your Uncle and Lucky Man. Her Lucky Night was alleged to have copied The Freshman; Lloyd claimed $500,000 in general damages and $500,000 in special damages for that film in particular. Lloyd won $60,000 for the Movie Crazy-So's Your Uncle infringement; he settled with Universal for more than $100,000 for the other two films.
References
External links
American musical films
1945 films
1945 musical films
Universal Pictures films
American black-and-white films
1940s English-language films
Films directed by Edward C. Lilley
1940s American films |
, was a Sengoku period Japanese castle located in what is now part of the town of Yoshimi, Hiki District, Saitama, in the Kantō region of Japan. Its ruins have been protected as a National Historic Site, since 2008. It is also referred to as Musashi-Matsuyama Castle, to distinguish it from the more famous Bitchū Matsuyama Castle or Iyo Matsuyama Castle.
Overview
Matsuyama Castle is located on the western bank of the Arakawa River in the center of Saitama, where the river changes direction from the east to the south. On a hill protected on two sides by the river, this location had natural fortifications and was part of a defensive line (along with Hachigata Castle, Kawagoe Castle and Edo Castle) created by Ōta Dōkan to protect Kamakura from enemies to the north and east. The exact date the castle was founded is uncertain, but it is believed to be in the latter half of the 15th century. The Uesugi clan was in constant struggle against the growing power of the Later Hōjō clan, who seized Kawagoe in 1537. Uesugi Tomosada escaped from Kawagoe to Musashi-Matsuyama. He ordered that the defenses of the castle be greatly expanded, with a large number of dry moats constructed. He continued to resist the Hōjō for ten more years, but was defeated and killed in 1546 by Hōjō Ujiyasu.
After the fall of the Ogigayatsu Uesugi clan, Matsuyama Castle came under the control of the Hōjō and was a major redoubt in their incessant struggles against Uesugi Kenshin. Kenshin managed to seize Matsuyama Castle and went on to lay siege to the Hōjō stronghold at Odawara Castle, but was unable to maintain momentum and eventually withdrew to Echigo Province in 1560. The Hōjō counterattacked as part of an alliance with Takeda Shingen, but as the defenders of the castle were well prepared, they could not easily retake it. Shingen brought in sappers from Kai Province in an attempt to undermine the defenses, but this also failed. After three months, Hōjō Ujiyasu tricked the garrison by telling them that they were completely cut off from rescue, as snow had closed the passes to Echigo and Kenshin's army was forced into winter camp. Believing this, the garrison surrendered; however, Kenshin was in fact only days away. On hearing of the surrender, Kenshin issued a challenge to combat, but as the Hōjō and Takeda had achieved their objective and were now entrenched in Matsuyama, the challenge went unanswered.
Afterwards, the castle was maintained by the Hōjō clan. It was retaken by Kenshin once, but recaptured by the Hōjō once again. At the time of the Odawara Campaign, Toyotomi Hideyoshi sent an army of 40,000 men led by Maeda Toshiie and Uesugi Kagekatsu against Musashi-Matsuyama, and its 2300 defenders surrendered without resistance. The castle was used until 1601, when it was abandoned.
At present, the outlines of the enclosures and moats remain in good shape. In 2008, the site received protection as one of the four "Hiki Fortification ruins" in Saitama, including the Sugaya Yakata, Sugiyama Castle, and Ogura Castle. The castle site is a 15-minute walk from Higashi-Matsuyama Station on the Tōbu Tōjō Line.
See also
List of Historic Sites of Japan (Saitama)
References
External links
Yoshimi Town official site
Castles in Saitama Prefecture
Yoshimi, Saitama
Historic Sites of Japan
Ruined castles in Japan
Musashi Province
Go-Hōjō clan |
```c++
// Distributed under the Boost Software License, Version 1.0. (See accompanying
// file LICENSE_1_0.txt or copy at path_to_url)
#include <iostream>
#include <boost/pfr/traits.hpp>
#include <type_traits> // for std::true_type, std::false_type and std::is_aggregate
namespace boost { namespace pfr {
struct boost_fusion_tag;
struct boost_json_tag;
}}
struct Aggregate {};
using Nonaggregate = int;
#if defined(__cpp_lib_is_aggregate)
static_assert(std::is_aggregate<Aggregate>::value && !std::is_aggregate<Nonaggregate>::value, "");
#endif
using Reflectable = short;
struct Nonrefrectable {};
using ReflectableBoostFusion = short;
struct NonrefrectableBoostFusion {};
using ReflectableBoostJson = short;
struct NonrefrectableBoostJson {};
namespace boost { namespace pfr {
template<class All> struct is_reflectable<Reflectable, All> : std::true_type {};
template<class All> struct is_reflectable<Nonrefrectable, All> : std::false_type {};
template<> struct is_reflectable<ReflectableBoostFusion, boost_fusion_tag> : std::true_type {};
template<> struct is_reflectable<NonrefrectableBoostFusion, boost_fusion_tag> : std::false_type {};
template<> struct is_reflectable<ReflectableBoostJson, boost_json_tag> : std::true_type {};
template<> struct is_reflectable<NonrefrectableBoostJson, boost_json_tag> : std::false_type {};
}}
#if BOOST_PFR_ENABLE_IMPLICIT_REFLECTION
template<class T, class Tag>
void assert_reflectable() {
static_assert(boost::pfr::is_implicitly_reflectable_v<T, Tag>, "");
static_assert(boost::pfr::is_implicitly_reflectable_v<const T, Tag>, "");
static_assert(boost::pfr::is_implicitly_reflectable_v<volatile T, Tag>, "");
static_assert(boost::pfr::is_implicitly_reflectable_v<const volatile T, Tag>, "");
}
template<class T, class Tag>
void assert_non_reflectable() {
static_assert(!boost::pfr::is_implicitly_reflectable_v<T, Tag>, "");
static_assert(!boost::pfr::is_implicitly_reflectable_v<const T, Tag>, "");
static_assert(!boost::pfr::is_implicitly_reflectable_v<volatile T, Tag>, "");
static_assert(!boost::pfr::is_implicitly_reflectable_v<const volatile T, Tag>, "");
}
#endif // #if BOOST_PFR_ENABLE_IMPLICIT_REFLECTION
int main() {
#if BOOST_PFR_ENABLE_IMPLICIT_REFLECTION
std::cout << "Implicit reflection is available on this platform." << std::endl;
{
using tag = boost::pfr::boost_json_tag;
assert_reflectable<Aggregate, tag>();
assert_non_reflectable<Nonaggregate, tag>();
assert_reflectable<Reflectable, tag>();
assert_non_reflectable<Nonrefrectable, tag>();
assert_reflectable<ReflectableBoostJson, tag>();
assert_non_reflectable<NonrefrectableBoostJson, tag>();
}
{
using tag = boost::pfr::boost_fusion_tag;
assert_reflectable<Aggregate, tag>();
assert_non_reflectable<Nonaggregate, tag>();
assert_reflectable<Reflectable, tag>();
assert_non_reflectable<Nonrefrectable, tag>();
assert_reflectable<ReflectableBoostFusion, tag>();
assert_non_reflectable<NonrefrectableBoostFusion, tag>();
}
#endif // #if BOOST_PFR_ENABLE_IMPLICIT_REFLECTION
}
``` |
Dmitri Nikolayevich Grachyov (born 6 September 1983) is a Russian former footballer who played as a centre back.
He made his professional debut in the Russian First Division in 2002 for FC Fakel Voronezh. After seven years in the First Division, he moved to Saturn Moscow Oblast of the Premier League in late August 2009.
References
1983 births
Living people
Russian men's footballers
FC Fakel Voronezh players
FC KAMAZ Naberezhnye Chelny players
FC Leon Saturn Ramenskoye players
Russian Premier League players
FC Spartak Vladikavkaz players
FC Ufa players
FC Zvezda Irkutsk players
FC Luch Vladivostok players
Men's association football defenders
FC Cherepovets players
Sportspeople from Voronezh |
Ondřej Látal (born March 15, 1981 in Ústí nad Labem) is a Czech professional ice hockey forward. He played for HC Sparta Praha during the 2010–11 Czech Extraliga postseason.
References
External links
1981 births
Czech ice hockey forwards
HC Sparta Praha players
Living people
Ice hockey people from Ústí nad Labem
Sportspeople from Třebíč
Ice hockey people from the Vysočina Region
Acadie–Bathurst Titan players
Stadion Hradec Králové players
SK Horácká Slavia Třebíč players
HC Kometa Brno players
HC Vrchlabí players
Czech expatriate ice hockey players in Canada |
```objective-c
#ifndef GGEOAREAMONITORSOURCE_H
#define GGEOAREAMONITORSOURCE_H
#include <QApplication>
#include <QGeoAreaMonitorSource>
extern "C" {
#include "ring.h"
}
class GGeoAreaMonitorSource : public QGeoAreaMonitorSource
{
Q_OBJECT
public:
VM *pVM;
List *pParaList;
// The extracted source repeated `char cEvent[100];` (and the matching
// setters, getters and slots) four times, which would not compile.
// Distinct names are reconstructed here, one per QGeoAreaMonitorSource
// signal (areaEntered, areaExited, monitorExpired, error); the exact
// original member names are an assumption.
char cAreaEnteredEvent[100];
char cAreaExitedEvent[100];
char cMonitorExpiredEvent[100];
char cErrorEvent[100];
GGeoAreaMonitorSource(QObject *parent,VM *pVM );
~GGeoAreaMonitorSource();
void geteventparameters(void) ;
void setAreaEnteredEvent(const char *cStr);
void setAreaExitedEvent(const char *cStr);
void setMonitorExpiredEvent(const char *cStr);
void setErrorEvent(const char *cStr);
const char *getAreaEnteredEvent(void);
const char *getAreaExitedEvent(void);
const char *getMonitorExpiredEvent(void);
const char *getErrorEvent(void);
public slots:
void areaEnteredSlot();
void areaExitedSlot();
void monitorExpiredSlot();
void errorSlot();
};
#endif
``` |
North–South Ski Bowl was a modest ski area in the western United States, located in northern Idaho in the Hoodoo Mountains of southern Benewah County.
Its bowl-shaped slope in the Idaho Panhandle National Forest faced northeast and the vertical drop was just under on Dennis Mountain, accessed from State Highway 6, south of Emida and north of Harvard. An "upside-down" ski area, the parking lot and lodge were at the top, less than a mile east of the highway, formerly designated as 95A (U.S. 95 Alternate). The access road meets the highway at its crest ("Harvard Hill"), just under , and climbs about ; the border with Latah County is approximately south.
History
With a day lodge built in the late 1930s by the Civilian Conservation Corps (CCC) through the Works Progress Administration (WPA),
the ski area was developed by the U.S. Forest Service, and originally owned and operated by Washington State College (Pullman is approximately southwest, about an hour by vehicle). In the early 1950s, it was known as the "St. Joe Ski Bowl," and prior to that as the "Emida Ski Bowl." After a poor snow year in 1958, it was sold to private owners, brothers Fred and Merle Craner, and a platter lift was added in 1959.
It was the primary training area for the WSU and UI intercollegiate ski teams, and included a ski jump. The Ramskull Ski club formed in 1960, named for the creek of the ski area. The road from the highway was improved and parking areas expanded in 1962.
The area closed for the 1969–70 season, after which the students of WSU (ASWSU) regained ownership and operated North–South until 1980. Additions included a chairlift in 1970 and a new lodge in 1976, and the area was lit for night skiing. The area got into financial difficulty in 1979, and the students searched for a buyer. After leasing it to a private operator for four seasons starting in 1980, ASWSU sold the area outright in 1984.
Present day
With an aging chairlift and inconsistent snowfall at a low elevation, alpine skiing was discontinued in the 1990s. The entrance area near the highway is now a "Park 'n' Ski" area for cross-country skiing and the top of the former ski area is home to Palouse Divide Lodge, a private conference and retreat facility.
See also
Tamarack Ski Area – near Troy (defunct)
References
External links
Palouse Divide Lodge – official site
Virtual tour
WSU.edu – daytrips – Inside Idaho
U.S. Forest Service – Palouse Divide Park 'n' Ski area
Map
Idaho Department of Transportation – camera – State Highway 6 – Harvard Hill
David Rumsey Map Collection – Historic road map (1937) – Idaho, Montana, Wyoming – Texaco (Rand McNally)
Idaho highway map (1956) – Shell (H.M. Gousha)
Ski areas and resorts in Idaho
Buildings and structures in Benewah County, Idaho
Washington State University |
Otoli is a village in Belgaum district of Karnataka, India.
References
Villages in Belagavi district |
```objective-c
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file.
#ifndef UI_GL_GL_IMPLEMENTATION_H_
#define UI_GL_GL_IMPLEMENTATION_H_
#include <string>
#include <vector>
#include "base/native_library.h"
#include "build/build_config.h"
#include "ui/gl/gl_export.h"
#include "ui/gl/gl_switches.h"
namespace gfx {
class GLContext;
// The GL implementation currently in use.
enum GLImplementation {
kGLImplementationNone,
kGLImplementationDesktopGL,
kGLImplementationDesktopGLCoreProfile,
kGLImplementationOSMesaGL,
kGLImplementationAppleGL,
kGLImplementationEGLGLES2,
kGLImplementationEGLGLES2SwiftShader,
kGLImplementationMockGL
};
struct GL_EXPORT GLWindowSystemBindingInfo {
GLWindowSystemBindingInfo();
std::string vendor;
std::string version;
std::string extensions;
bool direct_rendering;
};
void GL_EXPORT
GetAllowedGLImplementations(std::vector<GLImplementation>* impls);
#if defined(OS_WIN)
typedef void*(WINAPI* GLGetProcAddressProc)(const char* name);
#else
typedef void* (*GLGetProcAddressProc)(const char* name);
#endif
// Initialize a particular GL implementation.
GL_EXPORT bool InitializeStaticGLBindings(GLImplementation implementation);
// Initialize function bindings that depend on the context for a GL
// implementation.
GL_EXPORT bool InitializeDynamicGLBindings(GLImplementation implementation,
GLContext* context);
// Initialize Debug logging wrappers for GL bindings.
void InitializeDebugGLBindings();
// Initialize stub methods for drawing operations in the GL bindings. The
// null draw bindings default to enabled, so that draw operations do nothing.
void InitializeNullDrawGLBindings();
// TODO(danakj): Remove this when all test suites are using null-draw.
GL_EXPORT bool HasInitializedNullDrawGLBindings();
// Filter a list of disabled_extensions from GL style space-separated
// extension_list, returning a space separated list of filtered extensions, in
// the same order as the input.
GL_EXPORT std::string FilterGLExtensionList(
const char* extension_list,
const std::vector<std::string>& disabled_extensions);
// Once initialized, instantiating this turns the stub methods for drawing
// operations off, allowing drawing to occur while the object is alive.
class GL_EXPORT DisableNullDrawGLBindings {
public:
DisableNullDrawGLBindings();
~DisableNullDrawGLBindings();
private:
bool initial_enabled_;
};
GL_EXPORT void ClearGLBindings();
// Set the current GL implementation.
GL_EXPORT void SetGLImplementation(GLImplementation implementation);
// Get the current GL implementation.
GL_EXPORT GLImplementation GetGLImplementation();
// Does the underlying GL support all features from Desktop GL 2.0 that were
// removed from the ES 2.0 spec without requiring specific extension strings.
GL_EXPORT bool HasDesktopGLFeatures();
// Get the GL implementation with a given name.
GLImplementation GetNamedGLImplementation(const std::string& name);
// Get the name of a GL implementation.
const char* GetGLImplementationName(GLImplementation implementation);
// Add a native library to those searched for GL entry points.
void AddGLNativeLibrary(base::NativeLibrary library);
// Unloads all native libraries.
void UnloadGLNativeLibraries();
// Set an additional function that will be called to find GL entry points.
// Exported so that tests may set the function used in the mock implementation.
GL_EXPORT void SetGLGetProcAddressProc(GLGetProcAddressProc proc);
// Find an entry point in the current GL implementation. Note that the function
// may return a non-null pointer to something else than the GL function if an
// unsupported function is queried. Spec-compliant eglGetProcAddress and
// glxGetProcAddress are allowed to return garbage for unsupported functions,
// and when querying functions from the EGL library supplied by Android, it may
// return a function that prints a log message about the function being
// unsupported.
void* GetGLProcAddress(const char* name);
// Return information about the GL window system binding implementation (e.g.,
// EGL, GLX, WGL). Returns true if the information was retrieved successfully.
GL_EXPORT bool GetGLWindowSystemBindingInfo(GLWindowSystemBindingInfo* info);
// Helper for fetching the OpenGL extensions from the current context.
// This helper abstracts over differences between the desktop OpenGL
// core profile, and OpenGL ES and the compatibility profile. It's
// intended for users of the bindings, not the implementation of the
// bindings themselves. This is a relatively expensive call, so
// callers should cache the result.
GL_EXPORT std::string GetGLExtensionsFromCurrentContext();
// Helper for the GL bindings implementation to understand whether
// glGetString(GL_EXTENSIONS) or glGetStringi(GL_EXTENSIONS, i) will
// be used in the function above.
GL_EXPORT bool WillUseGLGetStringForExtensions();
} // namespace gfx
#endif // UI_GL_GL_IMPLEMENTATION_H_
``` |
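The contract documented for `FilterGLExtensionList` above (drop the disabled names from a GL-style space-separated extension list, preserving input order) can be sketched outside C++. This Python stand-in mirrors the described behavior; it is an illustration of the contract, not Chromium's actual implementation:

```python
def filter_gl_extension_list(extension_list: str, disabled: set) -> str:
    # keep input order, drop disabled names, re-join with single spaces
    kept = [ext for ext in extension_list.split() if ext not in disabled]
    return " ".join(kept)

exts = ("GL_ARB_vertex_array_object "
        "GL_EXT_texture_filter_anisotropic "
        "GL_ARB_debug_output")
print(filter_gl_extension_list(exts, {"GL_ARB_debug_output"}))
# expected: GL_ARB_vertex_array_object GL_EXT_texture_filter_anisotropic
```

Such filtering is typically driven by driver-bug workarounds: a blocklist of extension names is removed from what the context reports before higher layers ever see them.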
```xml
<project
xmlns="path_to_url"
xmlns:xsi="path_to_url"
xsi:schemaLocation="path_to_url path_to_url">
<modelVersion>4.0.0</modelVersion>
<groupId>foo.bar</groupId>
<artifactId>javadoc-image-extraction-with-javadoc</artifactId>
<version>0.0.1-SNAPSHOT</version>
<build>
<plugins>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.5.1</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
</plugins>
</build>
<dependencies>
<dependency>
<groupId>io.projectreactor</groupId>
<artifactId>reactor-test</artifactId>
<version>3.2.10.RELEASE</version>
</dependency>
</dependencies>
</project>
``` |
Helvibis chilensis is a species of comb-footed spider in the family Theridiidae. It is found in Brazil and Chile.
References
Theridiidae
Spiders described in 1884
Spiders of South America |
"Wiedersehen" is the ninth and penultimate episode of the fourth season of the AMC television series Better Call Saul, the spinoff of Breaking Bad. The episode aired on October 1, 2018, on AMC in the United States; internationally, it premiered on the streaming service Netflix in several countries.
Plot
Jimmy McGill and Kim Wexler work a successful scam to replace approved building plans for a Mesa Verde branch in Lubbock, Texas, with plans for a bigger building. On their way home, Jimmy suggests using their skills for more cons, but Kim counsels caution. After Jimmy's hearing, he discovers from the committee secretary that they have denied his reinstatement. Jimmy confronts the chairman, who tells Jimmy some of his answers were perceived as insincere, and he can reapply in a year.
An enraged Jimmy recounts the hearing to Kim, who points out that Jimmy never mentioned Chuck McGill; since their dispute caused Jimmy's suspension, the panel members considered his answers to be disingenuous. Jimmy complains that Kim only "slums" with him when she wants something, but Kim angrily responds that she has been Jimmy's biggest supporter since they first met. That night, Jimmy returns to Kim's apartment and wordlessly starts packing his belongings. When he says he "messed it all up," Kim asks if he still wants to be a lawyer. He says yes, and Kim says she will help.
Lalo Salamanca and Nacho Varga visit Hector Salamanca, who is in a nursing home. Lalo reminds Hector of the time Hector burned down a hotel in Mexico and killed the owner, who had treated Hector disrespectfully. He reveals he kept a souvenir — the front desk concierge bell. He ties the bell to Hector's wheelchair, allowing Hector to communicate more effectively. Lalo takes Nacho to Los Pollos Hermanos so he can meet Gus Fring in person, then asks Nacho to take him to Gus's chicken farm so he can see where the Salamancas receive their drugs after Gus's trucks bring them over the border.
Werner Ziegler's crew blasts the rock preventing construction of the meth lab's elevator, then celebrate because the end of their job is in sight. Werner asks permission to fly to Germany for a weekend with his wife, then come back to finish the work. Instead, Mike Ehrmantraut offers Werner an extra phone call with his wife, which Werner accepts. When Mike receives his report from the on-duty security team the next morning, he notices dead pixels in some monitor displays. He finds that Werner temporarily disabled the cameras, permitting him to move through the warehouse undetected, then cut the padlocks for the doors leading to the roof and escaped by climbing down the building's maintenance ladder.
Production
"Wiedersehen" includes the origin story for Hector Salamanca's bell, an item vitally important to the storylines of both Better Call Saul and Breaking Bad. In addition, it provides backstory details about the character Lalo, who was first named in Breaking Bad, and first appeared in the Better Call Saul episode "Coushatta."
In German, "Wiedersehen" is literally translated as "seeing again", and the expression "auf Wiedersehen" is used to indicate "goodbye" or "farewell". In this episode, Werner's crew have spray painted the word "Wiedersehen" on the rock they need to blast to make room for the elevator in the underground meth lab, in effect saying "goodbye" to the obstacle that has put them behind schedule. "Wiedersehen" can also be understood as a reference to Werner's escape, since he is effectively saying goodbye to his crew, to Mike, and to the job of supervising the meth lab's construction.
Reception
"Wiedersehen" received critical acclaim. On Rotten Tomatoes, it garnered a perfect 100% rating with an average score of 9.5/10 based on 14 reviews. The site's critical consensus is, "A feeling of inevitability permeates 'Wiedersehen', a surprisingly mellow and melancholy penultimate episode beautifully realized by Vince Gilligan and Gennifer Hutchison's creative reunion."
Ratings
"Wiedersehen" was watched by 1.35 million viewers on its first broadcast, earning a 0.5 rating among viewers aged 18–49, holding steady with the previous week.
References
External links
"Wiedersehen" at AMC
Better Call Saul (season 4) episodes
2018 American television episodes
Television episodes directed by Vince Gilligan
Television episodes written by Gennifer Hutchison |
```javascript
import React, { Component } from 'react';
import Editor, { createEditorStateWithText } from '@draft-js-plugins/editor';
import createHashtagPlugin from '@draft-js-plugins/hashtag';
import editorStyles from './editorStyles.module.css';
import hashtagStyles from './hashtagStyles.module.css';
const hashtagPlugin = createHashtagPlugin({ theme: hashtagStyles });
const plugins = [hashtagPlugin];
const text =
'In this editor, we can even apply our own styles #design #theme';
export default class CustomHashtagEditor extends Component {
state = {
editorState: createEditorStateWithText(text),
};
onChange = (editorState) => {
this.setState({
editorState,
});
};
focus = () => {
this.editor.focus();
};
render() {
return (
<div className={editorStyles.editor} onClick={this.focus}>
<Editor
editorState={this.state.editorState}
onChange={this.onChange}
plugins={plugins}
ref={(element) => {
this.editor = element;
}}
/>
</div>
);
}
}
``` |
"The Wolf" is a song by English rock band Mumford & Sons. It was released as the second single from their third studio album Wilder Mind on 9 April 2015 and charted in multiple countries. The official music video for the song was uploaded on 30 June 2015 to the band's Vevo channel on YouTube.
Composition
"The Wolf" is an alternative rock and garage rock song. The song displays the band's change in sound from heavily folk-inspired to more electric instruments. Most noticeably, the use of the banjo is absent from this song as well as Wilder Mind, the album which the song is featured on.
Music video
The official music video for the song, lasting three minutes and fifty-five seconds, was uploaded on 30 June 2015 to the band's Vevo channel on YouTube. Directed by Marcus Haney, the video takes place at the 2015 Bonnaroo Music Festival as the band explores its sights and sounds. It also showcases the headlining performance by the band. In the video the band members are dressed in various costumes; Marcus Mumford can be seen wearing a Robin Hood costume, Ted Dwane in a giant chicken costume, Winston Marshall in a wedding dress and Ben Lovett in a fox costume. Actor Ed Helms can also be seen in the video; the actor performed at the festival that year alongside Dierks Bentley.
Critical reception
The single has received positive critical reception. Sputnikmusic labeled the song as a "massive highlight" from Wilder Mind as well as a "beautiful form of alternative rock." Rolling Stone ranked "The Wolf" at number 43 on its annual year-end list to find the best songs of 2015.
Track listing
7" vinyl
Island/Glassnote/Gentlemen of the Road — 4730218
Digital download
Charts and certifications
Weekly charts
Year-end charts
Certifications
Release history
References
2014 songs
2015 singles
Glassnote Records singles
Island Records singles
Mumford & Sons songs
Song recordings produced by James Ford (musician)
Songs written by Ben Lovett (British musician)
Songs written by Marcus Mumford
Songs written by Ted Dwane
Songs written by Winston Marshall |
Snow Lake may refer to:
Lakes
Pakistan
Snow Lake (Pakistan) or Lukpe Lawo, a glacial basin
United States
Snow Lake (Idaho), an alpine lake
Snow Lake (Nevada), a glacial tarn
Snow Lake (New Mexico), a small reservoir
Snow Lake (King County, Washington)
Snow Lake (Mount Rainier), in Lewis County, Washington
Snow Lakes system in the Alpine Lakes Wilderness, Chelan County, Washington
Communities
Snow Lake, Manitoba, Canada
Snow Lake, Arkansas, United States
Snow Lake, Indiana, United States
Other uses
Snow Lake Airport, serving Snow Lake, Manitoba
Snow Lake Water Aerodrome, serving Snow Lake, Manitoba
Snow Lake Peak, in Nevada |
```javascript
let Steps=[];
if ($("#linkedInFeeds").find(".noLinkedInDiv").length === 0) {
Steps = [
{
intro: 'Welcome to LinkedIn Pages feeds Page: This page shows the posts(feeds) of LinkedIn page accounts.'
},
{
element: document.querySelector('.selectAccountsDiv'),
intro: 'From here you can select LinkedIn Page accounts of which You want to look for Feeds.'
},
{
element: document.querySelector('.rating-css'),
intro: 'From here you can change the rating of your Social Account.'
},
{
element: document.querySelector('.reSocioButtonClass'),
intro: 'From here you can Share Post to Multiple Social Media Accounts.'
},
{
element: document.querySelector('.postLinkClassDiv'),
intro: 'Clicking here will redirect to the original post on the LinkedIn page.'
},
];
}else{
Steps = [
{
intro: 'Welcome to LinkedIn Pages feeds Page: This page shows the posts(feeds) of LinkedIn page accounts.'
},
];
}
``` |