Peter Price (born 17 August 1949 in Wrexham) is a Welsh former professional footballer who played in England during the 1960s and 1970s.
Career
Early career
Price belonged to the generation that bridged two terminological eras, beginning his career as an inside forward and ending it as a striker while playing much the same role throughout.
He began his career as a youth at Liverpool, signing a full-time professional contract in 1966.
Peterborough United
In 1968, having failed to break into the Liverpool first team, he joined Peterborough United, for whom he scored 62 goals in 119 appearances.
Portsmouth
In 1972, Price signed for Portsmouth. Hampered by a back injury, he struggled to establish himself in the side, eventually leaving Fratton Park to return to his former club two years later.
Barnsley
He ended his career at Barnsley, making his final appearance in the spring of 1978.
International career
While at Peterborough, Price played for the Welsh under-23 side, scoring the winning goal against Scotland at Swansea City's Vetch Field ground.
References
1949 births
Living people
Footballers from Wrexham
Welsh men's footballers
Liverpool F.C. players
Peterborough United F.C. players
Barnsley F.C. players
Wales men's under-23 international footballers
Men's association football forwards
|
```c++
// The following code fragment is taken from CMyApp::InitInstance.
// CMyApp is derived from CWinApp.
// The main window has been initialized, so show and update it using
// m_nCmdShow, the CWinApp member that holds the nCmdShow value passed
// to the application when it was first launched.
// pMainFrame is the main MDI frame window of our app and is derived
// from CMDIFrameWnd.
pMainFrame->ShowWindow(m_nCmdShow);
pMainFrame->UpdateWindow();
```
|
The 2023–24 Colorado Buffaloes men's basketball team represents the University of Colorado Boulder in the 2023–24 NCAA Division I men's basketball season. The Buffaloes are led by head coach Tad Boyle in his fourteenth season at Colorado and play their home games at the CU Events Center in Boulder, Colorado, in their final season as members of the Pac-12 Conference before rejoining the Big 12 Conference in 2024–25.
Previous season
The Buffaloes finished the 2022–23 season 18–17, 8–12 in Pac-12 play. They defeated Washington in the first round of the Pac-12 Tournament before losing to UCLA in the quarterfinals. They received an at-large bid to the National Invitation Tournament, where they defeated Seton Hall in the first round before losing to Utah Valley in the second round.
Off-season
Departures
Incoming transfers
2023 Recruiting class
Roster
Schedule and results
Exhibition
Non-conference regular season
Pac-12 regular season
Pac-12 Tournament
Game summaries
MSU Denver (exhibition)
Rankings
*AP does not release post-NCAA Tournament rankings
References
Colorado
Colorado Buffaloes men's basketball seasons
Colorado Buffaloes
|
```php
<?php
declare(strict_types=1);
// Each row pairs an expected result with the argument string passed to the function under test.
return [
[2, '2.5, 1'],
[-2, '-2.5, -2'],
[-4, '-2.5, 2'],
[0.0, '0.0, 1'],
['#NUM!', '2.5, -2'],
['#DIV/0!', '123.456, 0'],
[1.5, '1.5, 0.1'],
[0.23, '0.234, 0.01'],
['exception', '123.456'],
['#VALUE!', '"ABC", 1'],
[15, '"17", "3"'],
[16, '19, 4'],
[0, ',1'],
[0, 'false,1'],
[1, 'true,1'],
['#VALUE!', '"", 1'],
[1, 'A2, 1'],
[2, 'A3, 1'],
[-4, 'A4, 1'],
[-6, 'A5, 1'],
];
```
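Judging from the values, these rows look like fixtures for an Excel-compatible FLOOR() function, paired as [expected result, argument string]; that reading is an assumption, since the fragment does not name the function under test. A rough Python sketch of the rounding rule the numeric rows encode, ignoring string and boolean coercion and float-precision handling:

```python
import math

def excel_floor(number, significance):
    # Excel-style FLOOR: round `number` to the nearest multiple of
    # `significance`, toward negative infinity for positive significance
    # and toward zero when both arguments are negative.
    if significance == 0:
        return '#DIV/0!'   # matches the '123.456, 0' row
    if number > 0 and significance < 0:
        return '#NUM!'     # matches the '2.5, -2' row
    return math.floor(number / significance) * significance

print(excel_floor(2.5, 1))    # 2
print(excel_floor(-2.5, -2))  # -2
print(excel_floor(-2.5, 2))   # -4
```

The '#NUM!' and '#DIV/0!' strings mirror Excel's error codes for a negative significance with a positive number and for a zero significance, respectively.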
|
```typescript
/*
*
* See the LICENSE file at the top-level directory of this distribution
* for licensing information.
*
* Unless otherwise agreed in a custom licensing agreement with the Lisk Foundation,
* no part of this software, including this file, may be copied, modified,
* propagated, or distributed except according to the terms contained in the
* LICENSE file.
*
* Removal or modification of this copyright notice is prohibited.
*/
import { BlockAsset, BlockAssets } from '@liskhq/lisk-chain';
import { codec } from '@liskhq/lisk-codec';
import * as cryptography from '@liskhq/lisk-cryptography';
import { objects } from '@liskhq/lisk-utils';
import { RandomMethod } from '../../../../src/modules/random/method';
import { SEED_LENGTH } from '../../../../src/modules/random/constants';
import { blockHeaderAssetRandomModule } from '../../../../src/modules/random/schemas';
import { bitwiseXOR } from '../../../../src/modules/random/utils';
import { MethodContext } from '../../../../src/state_machine';
import { createTransientMethodContext } from '../../../../src/testing';
import * as genesisValidators from '../../../fixtures/genesis_validators.json';
import { testCases } from '../../../fixtures/pos_random_seed_generation/pos_random_seed_generation_other_rounds.json';
import { RandomModule } from '../../../../src/modules/random';
import {
ValidatorRevealsStore,
ValidatorSeedReveal,
} from '../../../../src/modules/random/stores/validator_reveals';
const strippedHashOfIntegerBuffer = (num: number) =>
cryptography.utils.hash(cryptography.utils.intToBuffer(num, 4)).subarray(0, SEED_LENGTH);
describe('RandomModuleMethod', () => {
let randomMethod: RandomMethod;
let context: MethodContext;
let randomStore: ValidatorRevealsStore;
const randomModule = new RandomModule();
const EMPTY_BYTES = Buffer.alloc(0);
describe('isSeedRevealValid', () => {
const twoRoundsValidators: ValidatorSeedReveal[] = [];
const twoRoundsValidatorsHashes: { [key: string]: Buffer[] } = {};
for (const generator of testCases[0].input.blocks) {
const generatorAddress = cryptography.address.getAddressFromPublicKey(
Buffer.from(generator.generatorPublicKey, 'hex'),
);
const seedReveal = Buffer.from(generator.asset.seedReveal, 'hex');
twoRoundsValidators.push({
generatorAddress,
seedReveal,
height: generator.height,
valid: true,
});
if (!twoRoundsValidatorsHashes[generatorAddress.toString('hex')]) {
twoRoundsValidatorsHashes[generatorAddress.toString('hex')] = [];
}
twoRoundsValidatorsHashes[generatorAddress.toString('hex')].push(seedReveal);
}
beforeEach(async () => {
randomMethod = new RandomMethod(randomModule.stores, randomModule.events, randomModule.name);
context = createTransientMethodContext({});
randomStore = randomModule.stores.get(ValidatorRevealsStore);
await randomStore.set(context, EMPTY_BYTES, {
validatorReveals: twoRoundsValidators.slice(0, 103),
});
});
it('should throw error when asset is undefined', async () => {
// Arrange
const validatorAddress = cryptography.address.getAddressFromPublicKey(
Buffer.from(testCases[0].input.blocks[0].generatorPublicKey, 'hex'),
);
const blockAsset: BlockAsset = {
module: randomModule.name,
data: undefined as any,
};
// Act & Assert
await expect(
randomMethod.isSeedRevealValid(context, validatorAddress, new BlockAssets([blockAsset])),
).rejects.toThrow('Block asset is missing.');
});
it('should return true if the last revealed seed by generatorAddress in validatorReveals array is equal to the hash of seedReveal', async () => {
for (const [address, hashes] of Object.entries(twoRoundsValidatorsHashes)) {
// Arrange
const blockAsset: BlockAsset = {
module: randomModule.name,
data: codec.encode(blockHeaderAssetRandomModule, { seedReveal: hashes[1] }),
};
// Act
const isValid = await randomMethod.isSeedRevealValid(
context,
Buffer.from(address, 'hex'),
new BlockAssets([blockAsset]),
);
// Assert
expect(isValid).toBe(true);
}
});
it('should return true if no last seed reveal found', async () => {
// Arrange
await randomStore.set(context, EMPTY_BYTES, { validatorReveals: [] });
for (const [address, hashes] of Object.entries(twoRoundsValidatorsHashes)) {
const blockAsset: BlockAsset = {
module: randomModule.name,
data: codec.encode(blockHeaderAssetRandomModule, { seedReveal: hashes[1] }),
};
// Act
const isValid = await randomMethod.isSeedRevealValid(
context,
Buffer.from(address, 'hex'),
new BlockAssets([blockAsset]),
);
// Assert
expect(isValid).toBe(true);
}
});
it('should return false if there is a last revealed seed by generatorAddress in validatorReveals array but it is not equal to the hash of seedReveal', async () => {
await randomStore.set(context, EMPTY_BYTES, { validatorReveals: twoRoundsValidators });
for (const [address, hashes] of Object.entries(twoRoundsValidatorsHashes)) {
// Arrange
const blockAsset: BlockAsset = {
module: randomModule.name,
data: codec.encode(blockHeaderAssetRandomModule, { seedReveal: hashes[1] }),
};
// Act
const isValid = await randomMethod.isSeedRevealValid(
context,
Buffer.from(address, 'hex'),
new BlockAssets([blockAsset]),
);
// Assert
expect(isValid).toBe(false);
}
});
it('should return true if generatorAddress is not present in any element of validatorReveals array', async () => {
// Arrange
const { generatorAddress } = twoRoundsValidators[5];
const twoRoundsValidatorsClone1 = objects.cloneDeep(twoRoundsValidators);
twoRoundsValidatorsClone1[5].generatorAddress = Buffer.alloc(0);
await randomStore.set(context, EMPTY_BYTES, {
validatorReveals: twoRoundsValidatorsClone1.slice(0, 103),
});
const hashes = twoRoundsValidatorsHashes[generatorAddress.toString('hex')];
const blockAsset: BlockAsset = {
module: randomModule.name,
data: codec.encode(blockHeaderAssetRandomModule, { seedReveal: hashes[1] }),
};
// Act
const isValid = await randomMethod.isSeedRevealValid(
context,
generatorAddress,
new BlockAssets([blockAsset]),
);
// Assert
expect(isValid).toBe(true);
});
it('should return false if seedReveal is not a 16-byte value', async () => {
// Arrange
const { generatorAddress } = twoRoundsValidators[5];
const twoRoundsValidatorsClone2 = twoRoundsValidators;
twoRoundsValidatorsClone2[5].seedReveal = cryptography.utils.getRandomBytes(17);
await randomStore.set(context, EMPTY_BYTES, {
validatorReveals: twoRoundsValidatorsClone2.slice(0, 103),
});
const hashes = twoRoundsValidatorsHashes[generatorAddress.toString('hex')];
const blockAsset: BlockAsset = {
module: randomModule.name,
data: codec.encode(blockHeaderAssetRandomModule, { seedReveal: hashes[1] }),
};
// Act
const isValid = await randomMethod.isSeedRevealValid(
context,
generatorAddress,
new BlockAssets([blockAsset]),
);
// Assert
expect(isValid).toBe(false);
});
it('should return false if generatorAddress is not a 20-byte input', async () => {
// Arrange
const generatorAddress = cryptography.utils.getRandomBytes(21);
const twoRoundsValidatorsClone3 = objects.cloneDeep(twoRoundsValidators);
twoRoundsValidatorsClone3[5].generatorAddress = generatorAddress;
await randomStore.set(context, EMPTY_BYTES, {
validatorReveals: twoRoundsValidatorsClone3.slice(0, 103),
});
const hashes =
twoRoundsValidatorsHashes[twoRoundsValidators[5].generatorAddress.toString('hex')];
const blockAsset: BlockAsset = {
module: randomModule.name,
data: codec.encode(blockHeaderAssetRandomModule, { seedReveal: hashes[1] }),
};
// Act
const isValid = await randomMethod.isSeedRevealValid(
context,
generatorAddress,
new BlockAssets([blockAsset]),
);
// Assert
expect(isValid).toBe(false);
});
});
describe('getRandomBytes', () => {
const validatorsData = [
{
generatorAddress: Buffer.from(genesisValidators.validators[0].address, 'hex'),
seedReveal: Buffer.from(genesisValidators.validators[0].hashOnion.hashes[1], 'hex'),
height: 11,
valid: true,
},
{
generatorAddress: Buffer.from(genesisValidators.validators[0].address, 'hex'),
seedReveal: Buffer.from(genesisValidators.validators[0].hashOnion.hashes[2], 'hex'),
height: 13,
valid: true,
},
{
generatorAddress: Buffer.from(genesisValidators.validators[0].address, 'hex'),
seedReveal: Buffer.from(genesisValidators.validators[0].hashOnion.hashes[3], 'hex'),
height: 17,
valid: true,
},
{
generatorAddress: Buffer.from(genesisValidators.validators[0].address, 'hex'),
seedReveal: Buffer.from(genesisValidators.validators[0].hashOnion.hashes[4], 'hex'),
height: 19,
valid: true,
},
{
generatorAddress: Buffer.from(genesisValidators.validators[1].address, 'hex'),
seedReveal: Buffer.from(genesisValidators.validators[1].hashOnion.hashes[1], 'hex'),
height: 14,
valid: true,
},
{
generatorAddress: Buffer.from(genesisValidators.validators[2].address, 'hex'),
seedReveal: Buffer.from(genesisValidators.validators[2].hashOnion.hashes[1], 'hex'),
height: 15,
valid: false,
},
];
beforeEach(async () => {
randomMethod = new RandomMethod(randomModule.stores, randomModule.events, randomModule.name);
context = createTransientMethodContext({});
randomStore = randomModule.stores.get(ValidatorRevealsStore);
await randomStore.set(context, EMPTY_BYTES, { validatorReveals: validatorsData });
});
it('should throw error when height is negative', async () => {
const height = -11;
const numberOfSeeds = 2;
await expect(randomMethod.getRandomBytes(context, height, numberOfSeeds)).rejects.toThrow(
'Height or number of seeds cannot be negative.',
);
});
it('should throw error when numberOfSeeds is negative', async () => {
const height = 11;
const numberOfSeeds = -2;
await expect(randomMethod.getRandomBytes(context, height, numberOfSeeds)).rejects.toThrow(
'Height or number of seeds cannot be negative.',
);
});
it('should throw error if for every seedObject element in validatorReveals height > seedObject.height', async () => {
const height = 35;
const numberOfSeeds = 5;
await expect(randomMethod.getRandomBytes(context, height, numberOfSeeds)).rejects.toThrow(
'Height is in the future.',
);
});
it('should throw error when height is non integer input', async () => {
const height = 5.1;
const numberOfSeeds = 2;
await expect(randomMethod.getRandomBytes(context, height, numberOfSeeds)).rejects.toThrow(
'Height or number of seeds cannot be non integer.',
);
});
it('should throw error when number of seeds is non integer input', async () => {
const height = 5;
const numberOfSeeds = 0.3;
await expect(randomMethod.getRandomBytes(context, height, numberOfSeeds)).rejects.toThrow(
'Height or number of seeds cannot be non integer.',
);
});
it('should return XOR random bytes as 16 bytes value for height=11, numberOfSeeds=3', async () => {
const height = 11;
const numberOfSeeds = 3;
// Create a buffer from height + numberOfSeeds
const randomSeed = strippedHashOfIntegerBuffer(height + numberOfSeeds);
const hashesExpected = [
Buffer.from(genesisValidators.validators[0].hashOnion.hashes[1], 'hex'),
Buffer.from(genesisValidators.validators[0].hashOnion.hashes[2], 'hex'),
];
// Do XOR of randomSeed with hashes of seed reveal with height >= randomStoreValidator.height >= height + numberOfSeeds
const xorExpected = bitwiseXOR([randomSeed, ...hashesExpected]);
expect(xorExpected).toHaveLength(16);
await expect(randomMethod.getRandomBytes(context, height, numberOfSeeds)).resolves.toEqual(
xorExpected,
);
});
it('should return XOR random bytes for height=11, numberOfSeeds=4', async () => {
const height = 11;
const numberOfSeeds = 4;
// Create a buffer from height + numberOfSeeds
const randomSeed = strippedHashOfIntegerBuffer(height + numberOfSeeds);
const hashesExpected = [
Buffer.from(genesisValidators.validators[0].hashOnion.hashes[1], 'hex'),
Buffer.from(genesisValidators.validators[0].hashOnion.hashes[2], 'hex'),
Buffer.from(genesisValidators.validators[1].hashOnion.hashes[1], 'hex'),
];
// Do XOR of randomSeed with hashes of seed reveal with height >= randomStoreValidator.height >= height + numberOfSeeds
const xorExpected = bitwiseXOR([randomSeed, ...hashesExpected]);
await expect(randomMethod.getRandomBytes(context, height, numberOfSeeds)).resolves.toEqual(
xorExpected,
);
});
it('should return XOR random bytes for height=11, numberOfSeeds=5 excluding invalid seed reveal', async () => {
const height = 11;
const numberOfSeeds = 5;
// Create a buffer from height + numberOfSeeds
const randomSeed = strippedHashOfIntegerBuffer(height + numberOfSeeds);
const hashesExpected = [
Buffer.from(genesisValidators.validators[0].hashOnion.hashes[1], 'hex'),
Buffer.from(genesisValidators.validators[0].hashOnion.hashes[2], 'hex'),
Buffer.from(genesisValidators.validators[1].hashOnion.hashes[1], 'hex'),
];
// Do XOR of randomSeed with hashes of seed reveal with height >= randomStoreValidator.height >= height + numberOfSeeds
const xorExpected = bitwiseXOR([
bitwiseXOR([bitwiseXOR([randomSeed, hashesExpected[0]]), hashesExpected[1]]),
hashesExpected[2],
]);
await expect(randomMethod.getRandomBytes(context, height, numberOfSeeds)).resolves.toEqual(
xorExpected,
);
});
it('should return XOR random bytes for height=8, numberOfSeeds=4', async () => {
const height = 8;
const numberOfSeeds = 4;
// Create a buffer from height + numberOfSeeds
const randomSeed = strippedHashOfIntegerBuffer(height + numberOfSeeds);
const hashesExpected = [
Buffer.from(genesisValidators.validators[0].hashOnion.hashes[1], 'hex'),
];
// Do XOR of randomSeed with hashes of seed reveal with height >= randomStoreValidator.height >= height + numberOfSeeds
const xorExpected = bitwiseXOR([randomSeed, ...hashesExpected]);
await expect(randomMethod.getRandomBytes(context, height, numberOfSeeds)).resolves.toEqual(
xorExpected,
);
});
it('should return initial random bytes for height=7, numberOfSeeds=3', async () => {
const height = 7;
const numberOfSeeds = 3;
// Create a buffer from height + numberOfSeeds
const randomSeed = strippedHashOfIntegerBuffer(height + numberOfSeeds);
await expect(randomMethod.getRandomBytes(context, height, numberOfSeeds)).resolves.toEqual(
randomSeed,
);
});
it('should throw error for height=20, numberOfSeeds=1', async () => {
const height = 20;
const numberOfSeeds = 1;
await expect(randomMethod.getRandomBytes(context, height, numberOfSeeds)).rejects.toThrow(
'Height is in the future.',
);
});
});
describe('getRandomBytes from protocol specs', () => {
describe.each([...testCases].map(testCase => [testCase.description, testCase]))(
'%s',
(_description, testCase) => {
// Arrange
const { config, input, output } = testCase as any;
const validators: ValidatorSeedReveal[] = [];
for (const generator of input.blocks) {
const generatorAddress = cryptography.address.getAddressFromPublicKey(
Buffer.from(generator.generatorPublicKey, 'hex'),
);
const seedReveal = Buffer.from(generator.asset.seedReveal, 'hex');
validators.push({
generatorAddress,
seedReveal,
height: generator.height,
valid: true,
});
}
beforeEach(async () => {
randomMethod = new RandomMethod(
randomModule.stores,
randomModule.events,
randomModule.name,
);
context = createTransientMethodContext({});
randomStore = randomModule.stores.get(ValidatorRevealsStore);
await randomStore.set(context, EMPTY_BYTES, { validatorReveals: validators });
});
it('should generate correct random seeds', async () => {
// Arrange
// For randomSeed 1
const round = Math.floor(
input.blocks[input.blocks.length - 1].height / config.blocksPerRound,
);
const middleThreshold = Math.floor(config.blocksPerRound / 2);
const startOfRound = config.blocksPerRound * (round - 1) + 1;
// To validate seed reveal of any block in the last round we have to check till second last round that doesn't exist for last round
const heightForSeed1 = startOfRound - (round === 2 ? 0 : middleThreshold);
// For randomSeed 2
const endOfLastRound = startOfRound - 1;
const startOfLastRound = endOfLastRound - config.blocksPerRound + 1;
// Act
const randomSeed1 = await randomMethod.getRandomBytes(
context,
heightForSeed1,
round === 2 ? middleThreshold : middleThreshold * 2,
);
// There is previous round for last round when round is 2
const randomSeed2 =
round === 2
? strippedHashOfIntegerBuffer(endOfLastRound)
: await randomMethod.getRandomBytes(context, startOfLastRound, middleThreshold * 2);
// Assert
expect(randomSeed1.toString('hex')).toEqual(output.randomSeed1);
expect(randomSeed2.toString('hex')).toEqual(output.randomSeed2);
});
},
);
});
});
```
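The expectations above lean on two properties of the bitwiseXOR helper: it combines equal-length byte buffers byte by byte, and it is associative, which is why the nested calls in the numberOfSeeds=5 case equal a single flat call. A minimal Python sketch of such a helper (illustrative only; the real implementation lives in src/modules/random/utils):

```python
def bitwise_xor(buffers):
    # XOR a list of equal-length byte strings element-wise.
    if not buffers:
        raise ValueError('expected at least one buffer')
    length = len(buffers[0])
    if any(len(b) != length for b in buffers):
        raise ValueError('all buffers must have the same length')
    result = bytearray(buffers[0])
    for buf in buffers[1:]:
        for i, byte in enumerate(buf):
            result[i] ^= byte
    return bytes(result)
```

Because XOR is associative and commutative, folding the seed reveals pairwise (as the nested expectation does) and combining them in one call yield the same result.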
|
```java
package com.linchaolong.apktoolplus.module.settings;
import javafx.fxml.FXML;
import javafx.fxml.Initializable;
import javafx.scene.control.Button;
import javafx.scene.control.TextField;
import com.linchaolong.apktoolplus.base.Activity;
import com.linchaolong.apktoolplus.Config;
import com.linchaolong.apktoolplus.ui.DirectorySelecter;
import com.linchaolong.apktoolplus.ui.FileSelecter;
import com.linchaolong.apktoolplus.utils.ViewUtils;
import com.linchaolong.apktoolplus.utils.FileHelper;
import java.io.File;
import java.net.URL;
import java.util.ResourceBundle;
/**
* Created by linchaolong on 2016/3/29.
*/
public class CommonSettingsActivity extends Activity implements Initializable {
public static final String TAG = CommonSettingsActivity.class.getSimpleName();
@FXML
TextField textFieldSublimePath;
@FXML
Button btnSublimeSelect;
@FXML
TextField textFieldAppOutPath;
@FXML
TextField textFieldCmdParams;
@Override
public void initialize(URL location, ResourceBundle resources) {
// restore previously saved settings into the input fields
review();
// persist the command-line parameter field whenever it changes
ViewUtils.listenerInputAndSave(textFieldCmdParams,Config.kSublimeCmdParams);
}
public void selectSublime(){
File lastDir = Config.getDir(Config.kSublimePath);
File sublimeFile = FileSelecter.create(btnSublimeSelect.getParent().getScene().getWindow())
.addFilter("exe")
.addFilter("*")
.setInitDir(lastDir)
.setTitle("sublime")
.showDialog();
if(sublimeFile != null){
textFieldSublimePath.setText(sublimeFile.getPath());
Config.set(Config.kSublimePath,sublimeFile.getPath());
}
}
/**
* EasySDK
*/
public void selectAppOut(){
File lastDir = Config.getDir(Config.kAppOutputDir);
File dir = DirectorySelecter.create(btnSublimeSelect.getParent().getScene().getWindow())
.setInitDir(lastDir)
.setTitle("")
.showDialog();
if(FileHelper.exists(dir)){
textFieldAppOutPath.setText(dir.getPath());
Config.set(Config.kAppOutputDir,dir.getPath());
}
}
/**
* Restore saved configuration values into the text fields.
*/
private void review() {
// sublime
ViewUtils.review(textFieldSublimePath,Config.kSublimePath);
// command-line parameters
ViewUtils.review(textFieldCmdParams,Config.kSublimeCmdParams);
// ApkToolPlus
ViewUtils.review(textFieldAppOutPath,Config.kAppOutputDir);
}
}
```
|
```python
"""Provide a compatibility layer for requests.auth.HTTPDigestAuth."""
import requests


class _ThreadingDescriptor(object):
    """Route attribute access to per-thread storage, with a default."""

    def __init__(self, prop, default):
        self.prop = prop
        self.default = default

    def __get__(self, obj, objtype=None):
        return getattr(obj._thread_local, self.prop, self.default)

    def __set__(self, obj, value):
        setattr(obj._thread_local, self.prop, value)


class _HTTPDigestAuth(requests.auth.HTTPDigestAuth):
    init = _ThreadingDescriptor('init', True)
    last_nonce = _ThreadingDescriptor('last_nonce', '')
    nonce_count = _ThreadingDescriptor('nonce_count', 0)
    chal = _ThreadingDescriptor('chal', {})
    pos = _ThreadingDescriptor('pos', None)
    num_401_calls = _ThreadingDescriptor('num_401_calls', 1)


# requests 2.8.0 moved HTTPDigestAuth's per-request state into thread-local
# storage; the subclass above re-exposes it under the old attribute names.
# Earlier releases keep plain instance attributes and need no shim.
if requests.__build__ < 0x020800:
    HTTPDigestAuth = requests.auth.HTTPDigestAuth
else:
    HTTPDigestAuth = _HTTPDigestAuth
```
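The shim works through Python's descriptor protocol: each class-level _ThreadingDescriptor intercepts reads and writes of an attribute and routes them to a per-thread threading.local() object. A self-contained sketch of that pattern (the class and attribute names here are illustrative, not part of requests):

```python
import threading

class ThreadLocalAttr:
    # Same idea as _ThreadingDescriptor: attribute access is redirected
    # to a per-thread storage object, falling back to a default value.
    def __init__(self, name, default):
        self.name, self.default = name, default

    def __get__(self, obj, objtype=None):
        return getattr(obj._thread_local, self.name, self.default)

    def __set__(self, obj, value):
        setattr(obj._thread_local, self.name, value)

class DigestState:
    nonce_count = ThreadLocalAttr('nonce_count', 0)

    def __init__(self):
        self._thread_local = threading.local()

state = DigestState()
state.nonce_count = 5  # stored only for the current thread

seen = []
worker = threading.Thread(target=lambda: seen.append(state.nonce_count))
worker.start()
worker.join()
# The worker thread never set the attribute, so it reads the default, 0.
```

This is also why the shim only matters on newer requests releases, where HTTPDigestAuth already carries a _thread_local object for the descriptors to target.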
|
```c++
//
// I am making my contributions/submissions to this project solely in my
// personal capacity and am not conveying any rights to any intellectual
// property of any third parties.
#include <pch.h>
#include <jet/bcc_lattice_point_generator.h>
namespace jet {
void BccLatticePointGenerator::forEachPoint(
const BoundingBox3D& boundingBox,
double spacing,
const std::function<bool(const Vector3D&)>& callback) const {
double halfSpacing = spacing / 2.0;
double boxWidth = boundingBox.width();
double boxHeight = boundingBox.height();
double boxDepth = boundingBox.depth();
Vector3D position;
// Alternate z-layers are shifted by half the spacing (body-centered cubic).
bool hasOffset = false;
bool shouldQuit = false;
for (int k = 0; k * halfSpacing <= boxDepth && !shouldQuit; ++k) {
position.z = k * halfSpacing + boundingBox.lowerCorner.z;
double offset = (hasOffset) ? halfSpacing : 0.0;
for (int j = 0; j * spacing + offset <= boxHeight && !shouldQuit; ++j) {
position.y = j * spacing + offset + boundingBox.lowerCorner.y;
for (int i = 0; i * spacing + offset <= boxWidth; ++i) {
position.x = i * spacing + offset + boundingBox.lowerCorner.x;
if (!callback(position)) {
shouldQuit = true;
break;
}
}
}
hasOffset = !hasOffset;
}
}
} // namespace jet
```
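forEachPoint above traverses a body-centered cubic (BCC) lattice: z-layers are stepped at half the spacing, and every other layer is shifted by half the spacing in x and y. A Python sketch of the same traversal that collects points instead of invoking a callback (a unit box with spacing 1 yields the classic 8 corners plus 1 body center):

```python
def bcc_lattice_points(lower, upper, spacing):
    # Enumerate BCC lattice points inside an axis-aligned bounding box,
    # mirroring the loop structure of BccLatticePointGenerator::forEachPoint.
    half = spacing / 2.0
    width = upper[0] - lower[0]
    height = upper[1] - lower[1]
    depth = upper[2] - lower[2]
    points = []
    has_offset = False
    k = 0
    while k * half <= depth:
        z = k * half + lower[2]
        offset = half if has_offset else 0.0
        j = 0
        while j * spacing + offset <= height:
            y = j * spacing + offset + lower[1]
            i = 0
            while i * spacing + offset <= width:
                points.append((i * spacing + offset + lower[0], y, z))
                i += 1
            j += 1
        has_offset = not has_offset
        k += 1
    return points

pts = bcc_lattice_points((0.0, 0.0, 0.0), (1.0, 1.0, 1.0), 1.0)
print(len(pts))  # 9: eight corners plus the body center (0.5, 0.5, 0.5)
```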
|
The flag of the Republic of Dagestan was adopted after the transformation of the Dagestan ASSR into the Republic of Dagestan within the Russian Federation. The flag was formally adopted on 26 February 1994. It features a horizontal tricolor of green (for Islam), blue (for the Caspian Sea), and red (for courage and fidelity). On 19 November 2003 the proportion of the flag was changed from the original 1:2 to 2:3, and the middle stripe from light blue to blue.
Colour scheme
The official colour scheme was adopted on 19 November 2003.
Historical flags
Following its formation from parts of the Mountainous Republic of the Northern Caucasus in 1921, the Dagestan Autonomous Soviet Socialist Republic flew several flags of the standard ASSR pattern: first red flags defaced with the initials of the republic's name, and later an RSFSR flag defaced with the same. With the fall of the Soviet Union, Dagestan dropped the letter standing for "autonomous" from the inscription on its flag. A flag with horizontal blue and yellow stripes may have been used briefly in 1993 and 1994 until a variation of the current horizontal tricolor was adopted in 1994.
Other flags
Several peoples in Dagestan have devised their own ethnic flags:
See also
Flag of Russia
Coat of arms of Dagestan
References
Culture of Dagestan
Dagestan
|
Opharus tricyphoides is a moth of the family Erebidae first described by Walter Rothschild in 1909. It is found in Brazil, Peru and Costa Rica.
References
Moths described in 1909
Opharus
Moths of Central America
Moths of South America
|
Check if an argument is a number
Deleting properties
Precision
Closures
Scope and strict mode
|
The NEPAD African Western and Southern Networks of Centres of Excellence in Water Sciences are international collaborations between teams of researchers working in different parts of southern and western Africa on the economic development of local water resources. The southern network of nine centres is coordinated from the University of Stellenbosch in Cape Town, South Africa, the western network of five centres from the University of Cheikh Anta Diop in Dakar, Senegal.
Definition of Centre of Excellence by the United Nations Environment Programme (UNEP)
Centres of Excellence are physical or virtual centres focused on specific issues. They concentrate existing capabilities and resources to encourage collaboration across disciplines and across organisations on long-term programmes and projects directly relevant to human needs and aspirations. By definition, Centres of Excellence are widely known for their work.
Framework of the initiative
The New Partnership for Africa's Development (NEPAD) "explicitly recognizes that Africa's economic renewal and sustainable development will not be achieved without effective and efficient research and development (R&D) institutions." NEPAD therefore launched a programme specifically to identify and reinforce R&D capacities in Africa by building regional networks of Centres of Excellence in water sciences. The programme is in line with Africa's Science and Technology Consolidated Plan of Action (CPA). The specific goals for water sciences (p. 28 of the CPA) are:
to improve conservation and utilization of the continent's water resources;
to improve the quality and the quantity of water available to rural and urban households;
to strengthen national and regional capacities for water resources management and reduce impacts of water related disasters; and
to enlarge the range of technologies for water supply and improve access to affordable quality water.
Calls for expressions of interest were launched, and proposals evaluated, in both the Southern African and Western African regions to identify and appoint Centres of Excellence in water sciences. Two regional networks were set up in 2009. The existing networks may be further expanded.
Southern African Centres of Excellence network
Coordinator of the network is the University of Stellenbosch, South Africa.
Current members of the southern African network
Stellenbosch University (South Africa)
International Center for Water Economics and Governance in Africa (Mozambique)
KwaZulu-Natal University (South Africa)
Western Cape University (South Africa)
University of Malawi (Malawi)
University of Zambia (Zambia)
University of Botswana (Botswana)
Council for Scientific and Industrial Research (South Africa)
Polytechnic of Namibia (Namibia)
Western African Centres of Excellence network
The coordinator is the Doctoral School on Water of the University of Cheikh Anta Diop, Senegal.
Current members of the western African network
University of Cheikh Anta Diop (Senegal)
International Institute for Water and Environmental Engineering (Burkina Faso)
University of Benin (Nigeria)
National Water Resources Institute (Nigeria)
Kwame Nkrumah University of Science and Technology (Ghana)
References
External links
Southern Network
Western Network
Scientific organizations based in Africa
|
```kotlin
package splitties.internal.test
actual typealias RunWith = org.junit.runner.RunWith
actual typealias Runner = org.junit.runner.Runner
```
|
Nové Syrovice is a municipality and village in Třebíč District in the Vysočina Region of the Czech Republic. It has about 900 inhabitants.
Nové Syrovice lies approximately south of Třebíč, south of Jihlava, and south-east of Prague.
Administrative parts
The village of Krnčice is an administrative part of Nové Syrovice.
Notable people
Johann Georg Grasel (1790–1818), robber and murderer
References
Villages in Třebíč District
|
Function constructor vs. function declaration vs. function expression
`.bind()`
IIFE pattern
Method chaining
Check if a document is done loading
|
Julius Hammer was the father of Armand Hammer and the great-great-grandfather of Armie Hammer. He emigrated from what was then the Russian Empire.
References
Year of birth missing
Place of birth missing
Year of death missing
Place of death missing
Emigrants from the Russian Empire
Nationality missing
|
They Never Say When is a 1944 thriller novel by the British writer Peter Cheyney. It is the sixth in his series of novels featuring the London private detective Slim Callaghan, a British version of the increasingly popular hardboiled American detectives.
Synopsis
Callaghan is hired by a Mrs Paula Denys, who claims she paid to have a priceless coronet stolen from her husband's safe. Now the thief refuses to hand over the stolen coronet and is instead trying to blackmail her.
References
Bibliography
Magill, Frank Northen. Critical Survey of Mystery and Detective Fiction: Authors, Volume 1. Salem Press, 1988.
Reilly, John M. Twentieth Century Crime & Mystery Writers. Springer, 2015.
Server, Lee. Encyclopedia of Pulp Fiction Writers. Infobase Publishing, 2014.
1944 British novels
Novels by Peter Cheyney
British thriller novels
Novels set in London
British crime novels
William Collins, Sons books
|
The president of Marche is the supreme authority of the Italian region of Marche. The officeholder was originally appointed by the Regional Council of Marche.
Election
Originally appointed by the Regional Council of Marche, the president has been elected by popular vote under universal suffrage every five years, de facto since 1995 and de jure since 2000: the candidate who receives a plurality of votes is elected.
The office is tied to the Regional Council, which is elected at the same time: a majority bonus greatly increases the number of the president's supporters in the assembly. The council and the president are linked by a presumed relationship of confidence: if the president resigns or is dismissed by the council, a snap election is called for both the legislative and the executive offices, since the two bodies can never be chosen separately.
The popular election of the president, together with this relationship of confidence between president and legislature, makes it possible to classify this model of regional government as a particular form of semi-presidential system.
Powers
The president of Marche promulgates regional laws and regulations. He can receive special administrative functions by the national government. The president is one of the eighty members of the Regional Council and, in this capacity, he can propose new laws.
The president appoints and dismiss the Regional Cabinet (called Giunta Regionale in Italian). The Cabinet is composed by no more than sixteen regional assessors (assessori, literally "aldermen") who can be members of the Council at the same time. Assessors should not be confused with the ministers: according to Italian administrative law, assessors only receive delegations from the president to rule a bureau or an agency, the Region being a single legal person, not divided in ministries. One assessor can be appointed vice president. The president can also appoint four under-secretaries (sottosegretari) to help the president in his functions.
The Regional Cabinet prepares the budget, appoints the boards of public regional agencies and companies, manages assets, develops projects of governance, and appeals to the Constitutional Court of Italy if it considers that a national law may violate regional powers. The president and the Cabinet are two distinct authorities of the Region: in matters within its competence, the Cabinet gives its approval by a collective vote.
See also
List of presidents of Marche
Regional Council of Marche
References
Politics of le Marche
Marche
|
Amy Stephens is a Principal in Public Policy and Regulation Practice at Dentons, a multinational law firm; previously, she served as Colorado House Majority Leader and House Minority Caucus Chairman in the Colorado House of Representatives.
Biography
Stephens attended the University of California at Los Angeles and then California State University Fullerton, earning a bachelor's degree in communications. From 1991 to 2001, she worked as a public policy and youth culture specialist for the Christian ministry Focus on the Family. A sexual risk avoidance curriculum written by Stephens, No Apologies, has been translated into over a dozen languages. After leaving Focus on the Family, Stephens founded the consulting firm Fresh Ideas Communication & Consulting, assisting non-profit and faith-based organizations with communication, organization, and development issues. She also served as a panel expert on federal grant review committees for the federal Department of Health & Human Services.
Before running for the legislature herself, Stephens was a veteran of numerous Republican campaigns, including those of Colorado Governor Bill Owens, 4th Judicial District Attorney John Newsome, El Paso County Commissioner Wayne Williams, and U.S. President George W. Bush. She served as a member of the El Paso County Republican Committee and as a delegate to the 1996 and 2004 Republican National Conventions. Governor Owens also appointed Stephens to the Governor's Commission on the Welfare of Children.
Elected to the Colorado House of Representatives as a Republican in 2006, Stephens represented House District 20, which covers northern El Paso County, Colorado, including portions of Colorado Springs and the areas surrounding the United States Air Force Academy. She served as the House Majority Leader during the two years of Republican control of the House from 2010 to 2012. Following redistricting, Stephens was elected as the representative for Colorado's 19th House District. She sought the Republican nomination to challenge then-U.S. Senator Mark Udall in 2014, but withdrew from the race on February 27, 2014. In 2015 she was hired by Dentons, the world's largest global law firm, to lead their Denver Government Affairs practice.
Stephens is married; she and her husband, Ron (former Town of Monument Trustee), have one son, Nicholas.
Political career
2006 election
In 2006, Stephens won a 3:1 victory over Democratic opponent Jan Hejtmanek in an overwhelmingly Republican district. During her campaign, Stephens identified infrastructure issues, including water, as among her major legislative concerns.
2007 legislative session
In the 2007 session of the state legislature, Stephens sat on the House Judiciary Committee and was the ranking Republican on the House Business & Labor Affairs Committee. Four bills introduced by Rep. Stephens were passed by the General Assembly, most prominently a measure that would prohibit criminal charges against illegal immigrants from being dismissed without their deportation. In November 2007, upon Rep. Bill Cadman's appointment to the Colorado Senate, the first-term legislator was elected to succeed him as House Minority Caucus Chair.
2008 legislative session
In the 2008 session of the Colorado General Assembly, Stephens sat on the House Business Affairs and Labor Committee and the House Judiciary Committee.
Stephens sponsored a bill to tax in-room pay-per-view movies sold by hotels to fund child advocacy centers; after facing opposition from the hotel industry, Stephens asked for the bill to be killed in committee. Stephens also sponsored a bill, passed by the General Assembly, to streamline the teaching licensure application process for military spouses, and sponsored another bill to provide unemployment benefits to military spouses forced to relocate out of state. She sponsored successful legislation to require hospitals to publicly publish charges for common medical procedures. Stephens led Republican opposition to the 2008 state budget, criticizing it for excessive spending.
2008 election
Stephens again faced Democrat Jan Hejtmanek in the November 2008 legislative election.
In September 2008, Stephens was named to the "Palin Truth Squad," representatives of the McCain-Palin presidential campaign tasked with countering alleged distortions concerning the record of Republican vice-presidential candidate Sarah Palin. In that capacity, she made a number of media statements in support of Palin during the 2008 presidential campaign, spoke at an October rally in Colorado Springs featuring Palin, and delivered the invocation at a Denver rally featuring John McCain. In October, Stephens, with other Republican legislators, participated in a statewide "Save, Don't Spend" RV Tour critical of Democratic policies.
After winning re-election with 76 percent of the popular vote, Stephens was also re-elected Minority Caucus Chair by House Republicans, fending off a challenge for the post from Rep. Ellen Roberts.
2009 legislative session
With Democratic Rep. Joe Rice, Stephens sponsored legislation allowing health insurance providers to offer discounts for participation in wellness programs. Responding to a deal between labor and business leaders to remove several statewide referendums from the 2008 general election ballot, Stephens introduced legislation that would prohibit financial deals that would impact initiatives on Colorado election ballots. The measure was defeated in a House committee.
2012 election
As a consequence of redistricting in the state of Colorado, Representative Stephens ran for the State House in the 19th House District. The seat was previously held by Republican legislator Marsha Looper since the 2006 general election. Representative Stephens faced third party challengers from the ACN and Libertarian parties, but was reelected with over 80% of the vote.
2013 Legislative Session
In the 2013 session of the Colorado General Assembly, Stephens sat as Ranking Member on the House Health, Insurance & Environment Committee and House Public Health Care & Human Services Committee.
Stephens played a critical role in passing a bill that identifies mandatory reporters for incidents of elder abuse and applies a class 3 misdemeanor as a penalty for failing to do so (SB 111 Require Reports Of Elder Abuse and Exploitation). She also made it easier for mental health care providers from other states to work in treatment facilities operated by the U.S. Armed Forces (HB 1065 Federal Professionals Mental Health Authority). Additionally, Stephens passed a piece of legislation targeted at reporting waste-prevention in health care (HB 1196 Report Waste-Prevention Methods Accountable Care), as well as another bill assisting home-schooled students in participating in extracurricular activities in public schools (HB 1095 Home School Students Participation in Activities).
2014 U.S. Senate race
Stephens organized a campaign for U.S. Senate to run against Mark Udall. On February 26, 2014, while in the process of raising campaign funds and gathering signatures to petition onto the June Republican primary ballot, she announced her intentions to drop out of the race and support Cory Gardner.
Colorado Health Benefit Exchange Act
Noting that the federal Affordable Care Act had already been passed, Stephens and Senate President Pro-tem Betty Boyd co-sponsored bill SB 11-200, the Colorado Health Benefit Exchange Act, dubbed "Amycare" by some critics, in an attempt to create a free market within the healthcare system. "Amycare" was not well received by fellow Republicans or by Coloradans in general. The bill allowed individuals and small businesses to band together and negotiate in marketplaces for health care coverage the way large companies do. Stephens sponsored the legislation that created the Connect for Health Colorado insurance marketplace, arguing, "The only thing to decide was if we would exercise our state rights or put people in a federally run exchange." Stephens set up an implementation review committee to provide oversight and hold the exchange accountable to the legislature and to state audit.
References
External links
Legislative home page
1957 births
California State University, Fullerton alumni
Focus on the Family people
Living people
Republican Party members of the Colorado House of Representatives
People from El Paso County, Colorado
University of California, Los Angeles alumni
University of Colorado alumni
Women state legislators in Colorado
21st-century American politicians
21st-century American women politicians
|
```csharp
using System.Reflection;
namespace AspectCore.DynamicProxy.Parameters
{
public interface IParameterInterceptorSelector
{
IParameterInterceptor[] Select(ParameterInfo parameter);
}
}
```
|
```batchfile
REM the TODIR (destination folder) should NOT contain a trailing '\', this script will append it
SETLOCAL
SET NXDIR=%1
SET TODIR=%2\
SET FNDDIR=%3
SET NVTXDIR=%4
SET GLDIR=%5
SET WINSDKDIR=%6
echo Copy64
echo "NXDIR = " %NXDIR%
echo "TARGET = " %TODIR%
echo "NVTXDIR = " %NVTXDIR%
echo FNDDIR = %FNDDIR%
echo GLDIR = %GLDIR%
echo WINSDKDIR = %WINSDKDIR%
IF "%2"=="" GOTO ARGUMENT_ERROR
CALL :UPDATE_TARGET %NXDIR% PhysXDevice64.dll
CALL :UPDATE_TARGET %NXDIR% PhysX3_x64.dll
CALL :UPDATE_TARGET %NXDIR% PhysX3CHECKED_x64.dll
CALL :UPDATE_TARGET %NXDIR% PhysX3DEBUG_x64.dll
CALL :UPDATE_TARGET %NXDIR% PhysX3PROFILE_x64.dll
CALL :UPDATE_TARGET %NXDIR% PhysX3Common_x64.dll
CALL :UPDATE_TARGET %NXDIR% PhysX3CommonCHECKED_x64.dll
CALL :UPDATE_TARGET %NXDIR% PhysX3CommonDEBUG_x64.dll
CALL :UPDATE_TARGET %NXDIR% PhysX3CommonPROFILE_x64.dll
CALL :UPDATE_TARGET %NXDIR% PhysX3CharacterKinematic_x64.dll
CALL :UPDATE_TARGET %NXDIR% PhysX3CharacterKinematicCHECKED_x64.dll
CALL :UPDATE_TARGET %NXDIR% PhysX3CharacterKinematicDEBUG_x64.dll
CALL :UPDATE_TARGET %NXDIR% PhysX3CharacterKinematicPROFILE_x64.dll
CALL :UPDATE_TARGET %NXDIR% PhysX3Cooking_x64.dll
CALL :UPDATE_TARGET %NXDIR% PhysX3CookingCHECKED_x64.dll
CALL :UPDATE_TARGET %NXDIR% PhysX3CookingDEBUG_x64.dll
CALL :UPDATE_TARGET %NXDIR% PhysX3CookingPROFILE_x64.dll
CALL :UPDATE_TARGET %NXDIR% PhysX3Gpu_x64.dll
CALL :UPDATE_TARGET %NXDIR% PhysX3GpuCHECKED_x64.dll
CALL :UPDATE_TARGET %NXDIR% PhysX3GpuDEBUG_x64.dll
CALL :UPDATE_TARGET %NXDIR% PhysX3GpuPROFILE_x64.dll
CALL :UPDATE_TARGET %NVTXDIR% nvToolsExt*.dll
CALL :UPDATE_TARGET %FNDDIR% PxFoundation_x64.dll
CALL :UPDATE_TARGET %FNDDIR% PxFoundationCHECKED_x64.dll
CALL :UPDATE_TARGET %FNDDIR% PxFoundationPROFILE_x64.dll
CALL :UPDATE_TARGET %FNDDIR% PxFoundationDEBUG_x64.dll
CALL :UPDATE_TARGET %FNDDIR% PxPvdSDK_x64.dll
CALL :UPDATE_TARGET %FNDDIR% PxPvdSDKCHECKED_x64.dll
CALL :UPDATE_TARGET %FNDDIR% PxPvdSDKDEBUG_x64.dll
CALL :UPDATE_TARGET %FNDDIR% PxPvdSDKPROFILE_x64.dll
ENDLOCAL
GOTO END
REM ********************************************
REM NO CALLS TO :UPDATE_TARGET below this line!!
REM ********************************************
:UPDATE_TARGET
IF NOT EXIST %1\%2 (
echo File doesn't exist %1\%2
) ELSE (
XCOPY "%1\%2" "%TODIR%" /D /Y
)
GOTO END
:ARGUMENT_ERROR
ECHO ERROR: too few arguments to dll64copy.bat (need PhysXBinDir ApexBinDir)
:END
```
|
```c
/*
*
* specified in the README file that comes with this CVS source distribution.
*
* version.c - the CVS version number
*/
#include "cvs.h"
char *version_string = "Concurrent Versions System (CVS) 1.11.1p1";
#ifdef CLIENT_SUPPORT
#ifdef SERVER_SUPPORT
char *config_string = " (client/server)\n";
#else
char *config_string = " (client)\n";
#endif
#else
#ifdef SERVER_SUPPORT
char *config_string = " (server)\n";
#else
char *config_string = "\n";
#endif
#endif
static const char *const version_usage[] =
{
"Usage: %s %s\n",
NULL
};
/*
* Output a version string for the client and server.
*
* This function will output the simple version number (for the '--version'
* option) or the version numbers of the client and server (using the 'version'
* command).
*/
int
version (argc, argv)
int argc;
char **argv;
{
int err = 0;
if (argc == -1)
usage (version_usage);
#ifdef CLIENT_SUPPORT
if (current_parsed_root && current_parsed_root->isremote)
(void) fputs ("Client: ", stdout);
#endif
/* Having the year here is a good idea, so people have
some idea of how long ago their version of CVS was
released. */
(void) fputs (version_string, stdout);
(void) fputs (config_string, stdout);
#ifdef CLIENT_SUPPORT
if (current_parsed_root && current_parsed_root->isremote)
{
(void) fputs ("Server: ", stdout);
start_server ();
if (supported_request ("version"))
send_to_server ("version\012", 0);
else
{
send_to_server ("noop\012", 0);
fputs ("(unknown)\n", stdout);
}
err = get_responses_and_close ();
}
#endif
return err;
}
```
|
```c
/*
*
*/
#pragma once
#ifdef __cplusplus
extern "C" {
#endif
#define DR_REG_CLINT_M_BASE ( 0x20001800)
#define DR_REG_CLINT_U_BASE ( 0x20001C00)
/*CLINT MINT*/
#define CLINT_MINT_SIP_REG (DR_REG_CLINT_M_BASE + 0x0)
/* CLINT_CPU_MINT_SIP : R/W ;bitpos:[0] ;default: 1'b0 ; */
/*description: .*/
#define CLINT_CPU_MINT_SIP 0xFFFFFFFF
#define CLINT_CPU_MINT_SIP_M ((CLINT_CPU_MINT_SIP_V)<<(CLINT_CPU_MINT_SIP_S))
#define CLINT_CPU_MINT_SIP_V 0xFFFFFFFF
#define CLINT_CPU_MINT_SIP_S 0
#define CLINT_MINT_TIMECTL_REG (DR_REG_CLINT_M_BASE + 0x4)
/* CLINT_MINT_SAMPLING_MODE : R/W ;bitpos:[5:4] ;default: 2'b0 ; */
/*description: .*/
#define CLINT_MINT_SAMPLING_MODE 0x00000003
#define CLINT_MINT_SAMPLING_MODE_M  ((CLINT_MINT_SAMPLING_MODE_V)<<(CLINT_MINT_SAMPLING_MODE_S))
#define CLINT_MINT_SAMPLING_MODE_V 0x3
#define CLINT_MINT_SAMPLING_MODE_S 4
/* CLINT_MINT_COUNTER_OVERFLOW : R/W ;bitpos:[3] ;default: 1'b0 ; */
/*description: */
#define CLINT_MINT_COUNTER_OVERFLOW (BIT(3))
#define CLINT_MINT_COUNTER_OVERFLOW_M (BIT(3))
#define CLINT_MINT_COUNTER_OVERFLOW_V 0x1
#define CLINT_MINT_COUNTER_OVERFLOW_S 3
/* CLINT_MINT_TIMERINT_PENDING : R/W ;bitpos:[2] ;default: 1'b0 ; */
/*description: */
#define CLINT_MINT_TIMERINT_PENDING (BIT(2))
#define CLINT_MINT_TIMERINT_PENDING_M (BIT(2))
#define CLINT_MINT_TIMERINT_PENDING_V 0x1
#define CLINT_MINT_TIMERINT_PENDING_S 2
/* CLINT_MINT_TIMERINT_EN : R/W ;bitpos:[1] ;default: 1'b0 ; */
/*description: */
#define CLINT_MINT_TIMERINT_EN (BIT(1))
#define CLINT_MINT_TIMERINT_EN_M (BIT(1))
#define CLINT_MINT_TIMERINT_EN_V 0x1
#define CLINT_MINT_TIMERINT_EN_S 1
/* CLINT_MINT_COUNTER_EN : R/W ;bitpos:[0] ;default: 1'b0 ; */
/*description: */
#define CLINT_MINT_COUNTER_EN (BIT(0))
#define CLINT_MINT_COUNTER_EN_M (BIT(0))
#define CLINT_MINT_COUNTER_EN_V 0x1
#define CLINT_MINT_COUNTER_EN_S 0
#define CLINT_MINT_MTIME_L_REG (DR_REG_CLINT_M_BASE + 0x8)
/* CLINT_CPU_MINT_MTIME_L : R/W ;bitpos:[31:0] ;default: 32'h0 ; */
/*description: .*/
#define CLINT_CPU_MINT_MTIME_L 0xFFFFFFFF
#define CLINT_CPU_MINT_MTIME_L_M ((CLINT_CPU_MINT_MTIME_L_V)<<(CLINT_CPU_MINT_MTIME_L_S))
#define CLINT_CPU_MINT_MTIME_L_V 0xFFFFFFFF
#define CLINT_CPU_MINT_MTIME_L_S 0
#define CLINT_MINT_MTIME_H_REG (DR_REG_CLINT_M_BASE + 0xC)
/* CLINT_CPU_MINT_MTIME_H : RO ;bitpos:[31:0] ;default: 32'h0 ; */
/*description: .*/
#define CLINT_CPU_MINT_MTIME_H 0xFFFFFFFF
#define CLINT_CPU_MINT_MTIME_H_M ((CLINT_CPU_MINT_MTIME_H_V)<<(CLINT_CPU_MINT_MTIME_H_S))
#define CLINT_CPU_MINT_MTIME_H_V 0xFFFFFFFF
#define CLINT_CPU_MINT_MTIME_H_S 0
#define CLINT_MINT_MTIMECMP_L_REG (DR_REG_CLINT_M_BASE + 0x10)
/* CLINT_CPU_MINT_MTIMECMP_L : R/W ;bitpos:[31:0] ;default: 32'h0 ; */
/*description: .*/
#define CLINT_CPU_MINT_MTIMECMP_L 0xFFFFFFFF
#define CLINT_CPU_MINT_MTIMECMP_L_M ((CLINT_CPU_MINT_MTIMECMP_L_V)<<(CLINT_CPU_MINT_MTIMECMP_L_S))
#define CLINT_CPU_MINT_MTIMECMP_L_V 0xFFFFFFFF
#define CLINT_CPU_MINT_MTIMECMP_L_S 0
#define CLINT_MINT_MTIMECMP_H_REG (DR_REG_CLINT_M_BASE + 0x14)
/* CLINT_CPU_MINT_MTIMECMP_H : RO ;bitpos:[31:0] ;default: 32'h0 ; */
/*description: .*/
#define CLINT_CPU_MINT_MTIMECMP_H 0xFFFFFFFF
#define CLINT_CPU_MINT_MTIMECMP_H_M ((CLINT_CPU_MINT_MTIMECMP_H_V)<<(CLINT_CPU_MINT_MTIMECMP_H_S))
#define CLINT_CPU_MINT_MTIMECMP_H_V 0xFFFFFFFF
#define CLINT_CPU_MINT_MTIMECMP_H_S 0
/*CLINT UINT*/
#define CLINT_UINT_SIP_REG (DR_REG_CLINT_U_BASE + 0x0)
/* CLINT_CPU_UINT_SIP : R/W ;bitpos:[0] ;default: 1'b1 ; */
/*description: .*/
#define CLINT_CPU_UINT_SIP 0xFFFFFFFF
#define CLINT_CPU_UINT_SIP_M ((CLINT_CPU_UINT_SIP_V)<<(CLINT_CPU_UINT_SIP_S))
#define CLINT_CPU_UINT_SIP_V 0xFFFFFFFF
#define CLINT_CPU_UINT_SIP_S 0
#define CLINT_UINT_TIMECTL_REG (DR_REG_CLINT_U_BASE + 0x4)
/* CLINT_UINT_SAMPLING_MODE : R/W ;bitpos:[5:4] ;default: 2'b0 ; */
/*description: .*/
#define CLINT_UINT_SAMPLING_MODE 0x00000003
#define CLINT_UINT_SAMPLING_MODE_M  ((CLINT_UINT_SAMPLING_MODE_V)<<(CLINT_UINT_SAMPLING_MODE_S))
#define CLINT_UINT_SAMPLING_MODE_V 0x3
#define CLINT_UINT_SAMPLING_MODE_S 4
/* CLINT_UINT_COUNTER_OVERFLOW : R/W ;bitpos:[3] ;default: 1'b0 ; */
/*description: */
#define CLINT_UINT_COUNTER_OVERFLOW (BIT(3))
#define CLINT_UINT_COUNTER_OVERFLOW_M (BIT(3))
#define CLINT_UINT_COUNTER_OVERFLOW_V 0x1
#define CLINT_UINT_COUNTER_OVERFLOW_S 3
/* CLINT_UINT_TIMERINT_PENDING : R/W ;bitpos:[2] ;default: 1'b0 ; */
/*description: */
#define CLINT_UINT_TIMERINT_PENDING (BIT(2))
#define CLINT_UINT_TIMERINT_PENDING_M (BIT(2))
#define CLINT_UINT_TIMERINT_PENDING_V 0x1
#define CLINT_UINT_TIMERINT_PENDING_S 2
/* CLINT_UINT_TIMERINT_EN : R/W ;bitpos:[1] ;default: 1'b0 ; */
/*description: */
#define CLINT_UINT_TIMERINT_EN (BIT(1))
#define CLINT_UINT_TIMERINT_EN_M (BIT(1))
#define CLINT_UINT_TIMERINT_EN_V 0x1
#define CLINT_UINT_TIMERINT_EN_S 1
/* CLINT_UINT_COUNTER_EN : R/W ;bitpos:[0] ;default: 1'b0 ; */
/*description: */
#define CLINT_UINT_COUNTER_EN (BIT(0))
#define CLINT_UINT_COUNTER_EN_M (BIT(0))
#define CLINT_UINT_COUNTER_EN_V 0x1
#define CLINT_UINT_COUNTER_EN_S 0
#define CLINT_UINT_UTIME_L_REG (DR_REG_CLINT_U_BASE + 0x8)
/* CLINT_CPU_UINT_UTIME_L : R/W ;bitpos:[31:0] ;default: 32'h0 ; */
/*description: .*/
#define CLINT_CPU_UINT_UTIME_L 0xFFFFFFFF
#define CLINT_CPU_UINT_UTIME_L_M ((CLINT_CPU_UINT_UTIME_L_V)<<(CLINT_CPU_UINT_UTIME_L_S))
#define CLINT_CPU_UINT_UTIME_L_V 0xFFFFFFFF
#define CLINT_CPU_UINT_UTIME_L_S 0
#define CLINT_UINT_UTIME_H_REG (DR_REG_CLINT_U_BASE + 0xC)
/* CLINT_CPU_UINT_UTIME_H : RO ;bitpos:[31:0] ;default: 32'h0 ; */
/*description: .*/
#define CLINT_CPU_UINT_UTIME_H 0xFFFFFFFF
#define CLINT_CPU_UINT_UTIME_H_M ((CLINT_CPU_UINT_UTIME_H_V)<<(CLINT_CPU_UINT_UTIME_H_S))
#define CLINT_CPU_UINT_UTIME_H_V 0xFFFFFFFF
#define CLINT_CPU_UINT_UTIME_H_S 0
#define CLINT_UINT_UTIMECMP_L_REG (DR_REG_CLINT_U_BASE + 0x10)
/* CLINT_CPU_UINT_UTIMECMP_L : R/W ;bitpos:[31:0] ;default: 32'h0 ; */
/*description: .*/
#define CLINT_CPU_UINT_UTIMECMP_L 0xFFFFFFFF
#define CLINT_CPU_UINT_UTIMECMP_L_M ((CLINT_CPU_UINT_UTIMECMP_L_V)<<(CLINT_CPU_UINT_UTIMECMP_L_S))
#define CLINT_CPU_UINT_UTIMECMP_L_V 0xFFFFFFFF
#define CLINT_CPU_UINT_UTIMECMP_L_S 0
#define CLINT_UINT_UTIMECMP_H_REG (DR_REG_CLINT_U_BASE + 0x14)
/* CLINT_CPU_UINT_UTIMECMP_H : RO ;bitpos:[31:0] ;default: 32'h0 ; */
/*description: .*/
#define CLINT_CPU_UINT_UTIMECMP_H 0xFFFFFFFF
#define CLINT_CPU_UINT_UTIMECMP_H_M ((CLINT_CPU_UINT_UTIMECMP_H_V)<<(CLINT_CPU_UINT_UTIMECMP_H_S))
#define CLINT_CPU_UINT_UTIMECMP_H_V 0xFFFFFFFF
#define CLINT_CPU_UINT_UTIMECMP_H_S 0
#ifdef __cplusplus
}
#endif
```
|
{{DISPLAYTITLE:C19H24O4}}
The molecular formula C19H24O4 (molar mass: 316.39 g/mol, exact mass: 316.1675 u) may refer to:
DHSA
Ferujol
|
```javascript
/**
* @fileoverview Enforces props default values to be valid.
* @author Armano
*/
'use strict'
const utils = require('../utils')
const { capitalize } = require('../utils/casing')
/**
* @typedef {import('../utils').ComponentProp} ComponentProp
* @typedef {import('../utils').ComponentObjectProp} ComponentObjectProp
* @typedef {import('../utils').ComponentArrayProp} ComponentArrayProp
* @typedef {import('../utils').ComponentTypeProp} ComponentTypeProp
* @typedef {import('../utils').ComponentInferTypeProp} ComponentInferTypeProp
* @typedef {import('../utils').ComponentUnknownProp} ComponentUnknownProp
* @typedef {import('../utils').VueObjectData} VueObjectData
*/
const NATIVE_TYPES = new Set([
'String',
'Number',
'Boolean',
'Function',
'Object',
'Array',
'Symbol',
'BigInt'
])
const FUNCTION_VALUE_TYPES = new Set(['Function', 'Object', 'Array'])
/**
* @param {ObjectExpression} obj
* @param {string} name
* @returns {Property | null}
*/
function getPropertyNode(obj, name) {
for (const p of obj.properties) {
if (
p.type === 'Property' &&
!p.computed &&
p.key.type === 'Identifier' &&
p.key.name === name
) {
return p
}
}
return null
}
/**
* @param {Expression} targetNode
* @returns {string[]}
*/
function getTypes(targetNode) {
const node = utils.skipTSAsExpression(targetNode)
if (node.type === 'Identifier') {
return [node.name]
} else if (node.type === 'ArrayExpression') {
return node.elements
.filter(
/**
* @param {Expression | SpreadElement | null} item
* @returns {item is Identifier}
*/
(item) => item != null && item.type === 'Identifier'
)
.map((item) => item.name)
}
return []
}
module.exports = {
meta: {
type: 'suggestion',
docs: {
description: 'enforce props default values to be valid',
categories: ['vue3-essential', 'vue2-essential'],
      url: 'path_to_url'
},
fixable: null,
schema: [],
messages: {
invalidType:
"Type of the default value for '{{name}}' prop must be a {{types}}."
}
},
/** @param {RuleContext} context */
create(context) {
/**
* @typedef {object} StandardValueType
* @property {string} type
* @property {false} function
*/
/**
* @typedef {object} FunctionExprValueType
* @property {'Function'} type
* @property {true} function
* @property {true} expression
* @property {Expression} functionBody
* @property {string | null} returnType
*/
/**
* @typedef {object} FunctionValueType
* @property {'Function'} type
* @property {true} function
* @property {false} expression
* @property {BlockStatement} functionBody
* @property {ReturnType[]} returnTypes
*/
/**
* @typedef { ComponentObjectProp & { value: ObjectExpression } } ComponentObjectDefineProp
* @typedef { { type: string, node: Expression } } ReturnType
*/
/**
* @typedef {object} PropDefaultFunctionContext
* @property {ComponentObjectProp | ComponentTypeProp | ComponentInferTypeProp} prop
* @property {Set<string>} types
* @property {FunctionValueType} default
*/
/**
* @type {Map<ObjectExpression, PropDefaultFunctionContext[]>}
*/
const vueObjectPropsContexts = new Map()
/**
* @type { {node: CallExpression, props:PropDefaultFunctionContext[]}[] }
*/
const scriptSetupPropsContexts = []
/**
* @typedef {object} ScopeStack
* @property {ScopeStack | null} upper
* @property {BlockStatement | Expression} body
* @property {null | ReturnType[]} [returnTypes]
*/
/**
* @type {ScopeStack | null}
*/
let scopeStack = null
function onFunctionExit() {
scopeStack = scopeStack && scopeStack.upper
}
/**
* @param {Expression} targetNode
* @returns { StandardValueType | FunctionExprValueType | FunctionValueType | null }
*/
function getValueType(targetNode) {
const node = utils.skipChainExpression(targetNode)
switch (node.type) {
case 'CallExpression': {
// Symbol(), Number() ...
if (
node.callee.type === 'Identifier' &&
NATIVE_TYPES.has(node.callee.name)
) {
return {
function: false,
type: node.callee.name
}
}
break
}
case 'TemplateLiteral': {
// String
return {
function: false,
type: 'String'
}
}
case 'Literal': {
// String, Boolean, Number
if (node.value === null && !node.bigint) return null
const type = node.bigint ? 'BigInt' : capitalize(typeof node.value)
if (NATIVE_TYPES.has(type)) {
return {
function: false,
type
}
}
break
}
case 'ArrayExpression': {
// Array
return {
function: false,
type: 'Array'
}
}
case 'ObjectExpression': {
// Object
return {
function: false,
type: 'Object'
}
}
case 'FunctionExpression': {
return {
function: true,
expression: false,
type: 'Function',
functionBody: node.body,
returnTypes: []
}
}
case 'ArrowFunctionExpression': {
if (node.expression) {
const valueType = getValueType(node.body)
return {
function: true,
expression: true,
type: 'Function',
functionBody: node.body,
returnType: valueType ? valueType.type : null
}
}
return {
function: true,
expression: false,
type: 'Function',
functionBody: node.body,
returnTypes: []
}
}
}
return null
}
/**
* @param {*} node
* @param {ComponentObjectProp | ComponentTypeProp | ComponentInferTypeProp} prop
* @param {Iterable<string>} expectedTypeNames
*/
function report(node, prop, expectedTypeNames) {
const propName =
prop.propName == null
? `[${context.getSourceCode().getText(prop.node.key)}]`
: prop.propName
context.report({
node,
messageId: 'invalidType',
data: {
name: propName,
types: [...expectedTypeNames].join(' or ').toLowerCase()
}
})
}
/**
* @param {(ComponentObjectDefineProp | ComponentTypeProp | ComponentInferTypeProp)[]} props
* @param { { [key: string]: Expression | undefined } } withDefaults
*/
function processPropDefs(props, withDefaults) {
/** @type {PropDefaultFunctionContext[]} */
const propContexts = []
for (const prop of props) {
let typeList
let defExpr
if (prop.type === 'object') {
const type = getPropertyNode(prop.value, 'type')
if (!type) continue
typeList = getTypes(type.value)
const def = getPropertyNode(prop.value, 'default')
if (!def) continue
defExpr = def.value
} else {
typeList = prop.types
defExpr = withDefaults[prop.propName]
}
if (!defExpr) continue
const typeNames = new Set(
typeList.filter((item) => NATIVE_TYPES.has(item))
)
// There is no native types detected
if (typeNames.size === 0) continue
const defType = getValueType(defExpr)
if (!defType) continue
if (defType.function) {
if (typeNames.has('Function')) {
continue
}
if (defType.expression) {
if (!defType.returnType || typeNames.has(defType.returnType)) {
continue
}
report(defType.functionBody, prop, typeNames)
} else {
propContexts.push({
prop,
types: typeNames,
default: defType
})
}
} else {
if (
typeNames.has(defType.type) &&
!FUNCTION_VALUE_TYPES.has(defType.type)
) {
continue
}
report(
defExpr,
prop,
[...typeNames].map((type) =>
FUNCTION_VALUE_TYPES.has(type) ? 'Function' : type
)
)
}
}
return propContexts
}
return utils.compositingVisitors(
{
/**
* @param {FunctionExpression | FunctionDeclaration | ArrowFunctionExpression} node
*/
':function'(node) {
scopeStack = {
upper: scopeStack,
body: node.body,
returnTypes: null
}
},
/**
* @param {ReturnStatement} node
*/
ReturnStatement(node) {
if (!scopeStack) {
return
}
if (scopeStack.returnTypes && node.argument) {
const type = getValueType(node.argument)
if (type) {
scopeStack.returnTypes.push({
type: type.type,
node: node.argument
})
}
}
},
':function:exit': onFunctionExit
},
utils.defineVueVisitor(context, {
onVueObjectEnter(obj) {
/** @type {ComponentObjectDefineProp[]} */
const props = utils.getComponentPropsFromOptions(obj).filter(
/**
* @param {ComponentObjectProp | ComponentArrayProp | ComponentUnknownProp} prop
* @returns {prop is ComponentObjectDefineProp}
*/
(prop) =>
Boolean(
prop.type === 'object' && prop.value.type === 'ObjectExpression'
)
)
const propContexts = processPropDefs(props, {})
vueObjectPropsContexts.set(obj, propContexts)
},
/**
* @param {FunctionExpression | FunctionDeclaration | ArrowFunctionExpression} node
* @param {VueObjectData} data
*/
':function'(node, { node: vueNode }) {
const data = vueObjectPropsContexts.get(vueNode)
if (!data || !scopeStack) {
return
}
for (const { default: defType } of data) {
if (node.body === defType.functionBody) {
scopeStack.returnTypes = defType.returnTypes
}
}
},
onVueObjectExit(obj) {
const data = vueObjectPropsContexts.get(obj)
if (!data) {
return
}
for (const { prop, types: typeNames, default: defType } of data) {
for (const returnType of defType.returnTypes) {
if (typeNames.has(returnType.type)) continue
report(returnType.node, prop, typeNames)
}
}
}
}),
utils.defineScriptSetupVisitor(context, {
onDefinePropsEnter(node, baseProps) {
const props = baseProps.filter(
/**
* @param {ComponentProp} prop
* @returns {prop is ComponentObjectDefineProp | ComponentInferTypeProp | ComponentTypeProp}
*/
(prop) =>
Boolean(
prop.type === 'type' ||
prop.type === 'infer-type' ||
(prop.type === 'object' &&
prop.value.type === 'ObjectExpression')
)
)
const defaults = utils.getWithDefaultsPropExpressions(node)
const propContexts = processPropDefs(props, defaults)
scriptSetupPropsContexts.push({ node, props: propContexts })
},
/**
* @param {FunctionExpression | FunctionDeclaration | ArrowFunctionExpression} node
*/
':function'(node) {
const data =
scriptSetupPropsContexts[scriptSetupPropsContexts.length - 1]
if (!data || !scopeStack) {
return
}
for (const { default: defType } of data.props) {
if (node.body === defType.functionBody) {
scopeStack.returnTypes = defType.returnTypes
}
}
},
onDefinePropsExit() {
scriptSetupPropsContexts.pop()
}
})
)
}
}
```
|
Hastings was a parliamentary constituency in Sussex. It returned two Members of Parliament to the House of Commons of the Parliament of the United Kingdom until the 1885 general election, when its representation was reduced to one member. It was abolished for the 1983 general election, when it was partially replaced by the new Hastings and Rye constituency.
Boundaries
1918–1950: The County Borough of Hastings.
1950–1955: The County Borough of Hastings, the Municipal Borough of Rye, and the Rural District of Battle (except the parishes of Burwash, Etchingham and Ticehurst).
1955–1983: The County Borough of Hastings.
Members of Parliament
MPs 1366–1640
MPs 1640–1885
MPs 1885–1983
Elections
Elections in the 1830s
The votes for Warre, Cave and Taddy were rejected by the mayor.
Elections in the 1840s
Planta resigned by accepting the office of Steward of the Chiltern Hundreds, causing a by-election.
Elections in the 1850s
Brisco resigned by accepting the office of Steward of the Chiltern Hundreds, causing a by-election.
Elections in the 1860s
Powlett succeeded to the peerage, becoming Duke of Cleveland, and causing a by-election.
North's death caused a by-election.
Elections in the 1870s
Elections in the 1880s
Brassey was appointed a Civil Lord of the Admiralty, requiring a by-election.
Murray resigned, causing a by-election.
Elections in the 1890s
Elections in the 1900s
Elections in the 1910s
General Election 1914/15
Another General Election was required to take place before the end of 1915. The political parties had been making preparations for an election to take place, and by July 1914 the following candidates had been selected:
Unionist: Arthur Du Cros
Liberal: Cecil Patrick Black
Elections in the 1920s
Elections in the 1930s
General Election 1939/40
Another General Election was required to take place before the end of 1940. The political parties had been making preparations for an election to take place, and by the autumn of 1939 the following candidates had been selected:
Conservative: Maurice Hely-Hutchinson
Labour: William Wate Wood
Elections in the 1940s
Elections in the 1950s
Elections in the 1960s
Elections in the 1970s
References
Robert Beatson, A Chronological Register of Both Houses of Parliament (London: Longman, Hurst, Res & Orme, 1807)
D Brunton & D H Pennington, Members of the Long Parliament (London: George Allen & Unwin, 1954)
Cobbett's Parliamentary history of England, from the Norman Conquest in 1066 to the year 1803 (London: Thomas Hansard, 1808)
F W S Craig, British Parliamentary Election Results 1832–1885 (2nd edition, Aldershot: Parliamentary Research Services, 1989)
J E Neale, The Elizabethan House of Commons (London: Jonathan Cape, 1949)
Parliamentary constituencies in South East England (historic)
Constituencies of the Parliament of the United Kingdom established in 1366
Constituencies of the Parliament of the United Kingdom disestablished in 1983
Politics of East Sussex
Politics of Hastings
Cinque ports parliament constituencies
|
Pyomyositis is a bacterial infection of the skeletal muscles which results in an abscess. Pyomyositis is most common in tropical areas but can also occur in temperate zones.
Pyomyositis can be classified as primary or secondary. Primary pyomyositis is a skeletal muscle infection arising from hematogenous infection, whereas secondary pyomyositis arises from localized penetrating trauma or contiguous spread to the muscle.
Diagnosis
Diagnosis is established by the following:
Pus discharge culture and sensitivity
X-ray of the affected part to rule out osteomyelitis
Creatine phosphokinase (may be more than 50,000 units)
MRI
Ultrasound-guided aspiration
Treatment
Abscesses within the muscle must be drained surgically; patients without an abscess may not require surgery. Antibiotics are given for a minimum of three weeks to clear the infection.
Epidemiology
Pyomyositis is most often caused by the bacterium Staphylococcus aureus. The infection can affect any skeletal muscle, but most often infects the large muscle groups such as the quadriceps or gluteal muscles.
Pyomyositis is mainly a disease of children and was first described by Scriba in 1885. Most patients are aged 2 to 5 years, but infection may occur in any age group. Infection often follows minor trauma and is more common in the tropics, where it accounts for 4% of all hospital admissions. In temperate countries such as the US, pyomyositis was a rare condition (accounting for 1 in 3000 pediatric admissions), but has become more common since the appearance of the USA300 strain of MRSA.
Gonococcal pyomyositis is a rare infection caused by Neisseria gonorrhoeae.
Additional images
References
Maravelas R, Melgar TA, Vos D, Lima N, Sadarangani S (2020). "Pyomyositis in the United States 2002–2014". J Infect. 80(5):497–503. doi:10.1016/j.jinf.2020.02.005.
External links
Bacterial diseases
Bacterium-related cutaneous conditions
Muscular disorders
Inflammations
|
Easy JavaScript Simulations (EJSS), formerly known as Easy Java Simulations (EJS), is an open-source software tool, part of the Open Source Physics project, designed for the creation of discrete computer simulations.
A discrete computer simulation, or simply a computer simulation, is a computer program that tries to reproduce, for pedagogical or scientific purposes, a natural phenomenon through the visualization of the different states that it can have. Each of these states is described by a set of variables that change in time due to the iteration of a given algorithm.
When creating a simulation with EJSS, the user does not program at the level of writing code; instead, the user works at a higher conceptual level, declaring and organizing the equations and other mathematical expressions that drive the simulation. EJSS handles the technical aspects of coding the simulation in Java or JavaScript, freeing the user to concentrate on the simulation's content.
The generated Java or JavaScript code is, in terms of efficiency and sophistication, comparable to code written by a professional programmer.
EJSS is written in the Java programming language, and the simulations it creates are in Java or JavaScript. Java Virtual Machines (JVMs) are available for many different platforms; any platform with a JVM can run Java programs. Although Java applets were popular before 2014, the JavaScript output can now run on almost any device, including Android and iOS.
EJSS has its own XML-based format for storing simulations, using the extensions .xml, .ejs and .ejss. A file contains not only the code for the simulation but also supporting material such as the HTML introduction.
References
Wolfgang Christian and Francisco Esquembre, Modeling Physics with Easy Java Simulations The Physics Teacher, Volume 45, Issue 8, November 2007, pp. 468–528
Francisco Esquembre, "Easy Java Simulations: a software tool to create scientific simulations in Java", Computer Physics Communications, Volume 156, Issue 2, 1 January 2004, pp. 199–204
Anne Cox, Computational Modeling in Intro Physics Labs: Tracker and EJS, 2009 American Association of Physics Teachers Summer Meeting
External links
Simulation software
Plotting software
Cross-platform software
Free software programmed in Java (programming language)
|
David Maxwell Fenbury (24 March 1916 – 14 May 1976) was an Australian public servant who spent most of his career in the Territory of Papua and New Guinea.
Early life and education
Fenbury was born David Maxwell Fienberg in the Subiaco suburb of Perth in 1916, the third child of railway official David Percival Fienberg and Beatrice Amelia (née Conroy). He attended Christian Brothers' College in Perth and the University of Western Australia, graduating with a degree in Arts in 1937. Whilst at university he edited the university magazine, the Pelican.
Career
In 1937, Fenbury became a cadet patrol officer in the Territory of New Guinea. He joined the Australian Imperial Force in 1941, becoming a lieutenant the following year when he joined the Australian New Guinea Administrative Unit. For the next three years he led guerrilla operations against the Japanese invaders, during which time he was promoted to captain, mentioned in dispatches and awarded the Military Cross.
In 1946, Fenbury was seconded to the British Colonial Office to gain knowledge of colonial administration. He was posted to Tanganyika, where he learned about the village council system.
On 15 May 1948, at St Mark's Anglican Church, East Brighton, Melbourne, Fenbury married Joan Marion Brazier; the couple had two children. He returned to Papua New Guinea the same year and was given responsibility for implementing a new system of local government. In 1954 he moved to Port Moresby and was appointed a District Commissioner in 1955.
In 1956, he was the Australian government's nominee to the secretariat of the Trusteeship Council of the United Nations, a role he held for two years, before returning to Papua and New Guinea to become Secretary to the Department of the Administrator in 1960. In the same year he changed his surname from Fienberg to Fenbury by deed poll. After his first wife died in 1964, he married Helen Mary Shiels in a civil ceremony in Port Moresby in May 1966.
In 1969, he was appointed Secretary of the new Department of Social Development and Home Affairs, where he remained until retiring in 1973. The following year he took on a visiting fellowship at the Australian National University. He started writing a book, Practice without policy, completing six chapters before his death. The book was published posthumously in 1978.
Death
Fenbury died on 14 May 1976 from injuries received as a result of being hit by a bus in Leederville, a suburb of Perth. He was survived by his wife, and his two children from his first marriage.
References
1916 births
People from Perth, Western Australia
People educated at Christian Brothers' College, Perth
Australian public servants
Australian Army personnel of World War II
Australian recipients of the Military Cross
Territory of New Guinea people
Territory of Papua and New Guinea people
Papua New Guinean civil servants
Burials at Karrakatta Cemetery
1976 deaths
Australian Army officers
Road incident deaths in Western Australia
|
- Asynchronous file write/read in Node.js
- HTTP server in **Node**
- Streams in **Node**
- `uncaughtException` listener in Node.js
- Automatic compilation for Node with **Nodemon**
|
Georg von Hantelmann (1898–1924) was a German First World War fighter ace credited with 25 confirmed aerial victories. Most notably, he shot down ten opponents within a week, including three aces.
The victory list
The victories of Georg von Hantelmann are reported in chronological order, which is not necessarily the order or dates the victories were confirmed by headquarters.
Abbreviations were expanded by the editor creating this list.
Endnotes
References
Aerial victories of Hantelmann, Georg von
Hantelmann, Georg von
|
Events from the year 1781 in Great Britain.
Incumbents
Monarch – George III
Prime Minister – Frederick North, Lord North (Tory)
Parliament – 15th
Events
1 January – Industrial Revolution: The Iron Bridge opens across the River Severn.
5 January – American Revolutionary War: Richmond, Virginia, is burned by British naval forces led by Benedict Arnold.
6 January – Battle of Jersey: British troops prevent the French from occupying Jersey in the Channel Islands.
17 January – American Revolutionary War: the American Continental Army under Daniel Morgan decisively defeats British forces at the Battle of Cowpens in South Carolina.
January – William Pitt the Younger, later Prime Minister, enters Parliament, aged 21.
3 February – American Revolutionary War and Fourth Anglo-Dutch War: Capture of Sint Eustatius – British forces led by General John Vaughan and Admiral George Rodney take the Dutch Caribbean island of Sint Eustatius (which has been supplying the United States), with only a few shots fired. On 26 November it is retaken by Dutch-allied French forces.
28 February – foundation of the Literary and Philosophical Society of Manchester.
13 March – Sir William Herschel discovers the planet Uranus. Originally he calls it Georgium Sidus (George's Star) in honour of King George III.
15 March – American Revolutionary War: American General Nathanael Greene loses the Battle of Guilford Court House to the British.
1 July – Second Anglo-Mysore War: at the Battle of Porto Novo, the British defeat the Mysore ruler Hyder Ali.
6 July – American Revolutionary War: At the Battle of Green Spring, the British led by Lord Cornwallis defeat the French led by the Marquis de Lafayette.
27 July – French spy François Henri de la Motte executed at Tyburn (London) for high treason.
30 August – American Revolutionary War: French fleet under the Comte de Grasse enters Chesapeake Bay, cutting British General Charles Cornwallis off from escape by sea.
5 September – American Revolutionary War: in the Battle of the Chesapeake, a British fleet under Thomas Graves arrives and fights de Grasse, but is unable to break through to relieve the Siege of Yorktown.
6 September – American Revolutionary War: Battle of Groton Heights – a British force under Benedict Arnold attacks a fort in Groton, Connecticut, achieving a strategic victory.
19 October – American Revolutionary War: following the Siege of Yorktown, General Charles Cornwallis surrenders to General George Washington at Yorktown, Virginia, ending the armed struggle of the American Revolutionary War.
29 November
Zong massacre: English slave traders begin to throw approximately 142 slaves taken on in Accra overboard alive from the slave ship Zong in the Caribbean Sea to conserve supplies for the remainder; the Liverpool owners subsequently attempt to reclaim part of their value from insurers.
Henry Hurle officially founds the Ancient Order of Druids in London.
3 December – first known building society established, in Birmingham.
12 December – American Revolutionary War: Second Battle of Ushant – the Royal Navy, commanded by Rear Admiral Richard Kempenfelt, decisively defeats the French fleet in the Bay of Biscay.
Last year in which the monarch participates in a regular peacetime meeting of the Cabinet.
Publications
Peter Beckford's Thoughts on Hunting.
Edward Gibbon's The History of the Decline and Fall of the Roman Empire, volumes 2 and 3.
John Wood, the Younger's pattern book A Series of Plans for Cottages or Habitations of the Labourer.
The collection of children's poetry Mother Goose's Melody.
Births
21 February – Bulkeley Bandinel, scholar-librarian (died 1861)
29 May – John Walker, inventor (died 1859)
9 June – George Stephenson, locomotive engineer (died 1848)
6 July – Stamford Raffles, founder of Singapore (died 1826)
8 July – Tom Cribb, bare-knuckle boxer (died 1848)
14 September – James Walker, Scottish civil engineer (died 1862)
3 November – Sarah Elizabeth Utterson, translator and author (died 1851)
6 November – Lucy Aikin, English writer (died 1864)
30 November – Alexander Berry, adventurer and Australian pioneer (died 1873)
11 December – Sir David Brewster, physicist (died 1868)
Deaths
12 January – Richard Challoner, Catholic prelate (born 1691)
21 February – Matchem, racehorse (born 1748)
24 February – Edward Capell, critic (born 1713)
19 April – Elizabeth Raffald, cookery writer and entrepreneur (born 1733)
23 April – James Abercrombie, general (born 1706)
8 May – Richard Jago, poet (born 1715)
17 May – William Aislabie, politician (born 1700)
28 September – William Henry Nassau de Zuylestein, 4th Earl of Rochford, diplomat and statesman (born 1717)
16 October – Edward Hawke, 1st Baron Hawke, naval officer (born 1705)
See also
1781 in Wales
References
Years in Great Britain
1781 by country
1781 in Europe
1780s in Great Britain
|
Dongguan Church, located in Shenyang, Liaoning Province, China, is one of the largest and oldest Protestant churches in Northeast China. It is also known as the cradle of Christianity for Koreans in China and on the Korean Peninsula.
General
Dongguan Church, built in the second half of the 19th century, is located in Shenyang, Liaoning Province, China. It is one of the largest and oldest Protestant churches in Northeast China. It was so named because it was built just outside the East Gate, also called Dongguan (East Barrier), as a church was not allowed within the city wall.
John Ross, sent to Manchuria by the United Presbyterian Church of Scotland, went first to Yingkou, then moved to Mukden (Shenyang) and established a church there in 1889. This church building was destroyed during the Boxer Rebellion in 1900, but was reconstructed in 1907. It was damaged during the Cultural Revolution, and later enlarged in 1992. An annex was built in 1998. The centennial of the 1907 church building's dedication was celebrated in 2007.
Today, Christian worship is held in the main church building and, on Sundays (7:00, 9:00 and 11:00 am), is also relayed by cable TV to the five-storey annex building behind it. To the right of the main building is the John Ross Memorial Hall, which is also open to the public.
First Korean Bible translation
While in China, John Ross met Korean traders and decided to produce a Korean translation of the New Testament, which was completed in 1887 and brought to Korea.
See also
Christianity in China
Christianity in Korea
Korean Bible translation
Xita Church
References
Churches in Shenyang
Protestant churches in China
|
```java
/* Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.flowable.common.engine.impl;
/**
* @author Valentin Zickner
*/
public interface DefaultTenantProvider {
String getDefaultTenant(String tenantId, String scope, String scopeKey);
}
```
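As a sketch of how this interface could be used, here is a minimal, hypothetical implementation. The class name `FixedDefaultTenantProvider` and its always-one-tenant behaviour are illustrative assumptions, not Flowable's actual code; the interface is redeclared locally (without the Flowable package) so the sketch compiles on its own.

```java
// Same shape as org.flowable.common.engine.impl.DefaultTenantProvider,
// redeclared here so this file is self-contained.
interface DefaultTenantProvider {
    String getDefaultTenant(String tenantId, String scope, String scopeKey);
}

// Hypothetical implementation: ignores the lookup context and always
// falls back to a single configured tenant.
class FixedDefaultTenantProvider implements DefaultTenantProvider {
    private final String defaultTenant;

    FixedDefaultTenantProvider(String defaultTenant) {
        this.defaultTenant = defaultTenant;
    }

    @Override
    public String getDefaultTenant(String tenantId, String scope, String scopeKey) {
        // No per-scope logic: every caller gets the same fallback tenant.
        return defaultTenant;
    }
}

public class DefaultTenantProviderDemo {
    public static void main(String[] args) {
        DefaultTenantProvider provider = new FixedDefaultTenantProvider("shared");
        // Whatever tenant, scope, and scope key are asked for,
        // this provider answers "shared".
        System.out.println(provider.getDefaultTenant("unknownTenant", "bpmn", "myProcess"));
    }
}
```

A real provider might instead consult configuration or a registry keyed by `scope`; the interface leaves that policy entirely to the implementer.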
|
London Express is the second album by Mexican alternative rock vocalist Elan.
London Express finds its roots in the music of The Beatles, which Elan has described as "the only band that really changed everything".
The first single from the album was the opening track, "Be Free".
Track listing
"Be Free" (5:07)
"Whatever It Takes" (3:54)
"Don't Worry" (3:04)
"Devil in Me" (5:16)
"Like Me" (3:31)
"London Express" (3:03)
"This Fool's Life" (3:39)
"Nobody Knows" (7:14)
"Someday I Will Be" (5:17)
"The Big Time" (3:34)
"Glow" (3:56)
"Sweet Little You" (3:05)
"Get Your Blue" (4:54)
Singles
"Be Free"
"This Fool's Life"
"Whatever It Takes"
Be Free
"Be Free" is the first single taken from the album London Express. It's also the opening song of that album.
Track listing
"Be Free" (radio edit)
"Be Free" (album version)
References
External links
London Express official microsite
2005 albums
Elán (musician) albums
|
```yaml
swagger: "2.0"
info:
title: EngineerCMS API
description: |
ECMS has every tool to get any job done, so codename for the new ECMS APIs.
version: 1.0.0
contact:
email: 504284@qq.com
basePath: /v1
paths:
/admin/:
get:
tags:
- admin
description: |-
get admin page
<br>
operationId: AdminController.getAdminBlock
responses:
"200":
description: ""
schema:
$ref: '#/definitions/success'
"400":
description: Invalid page
"404":
description: page not found
/admin/category/{id}:
get:
tags:
- admin
description: |-
Get Category list by some info
<br>
operationId: AdminController.Get Category list
parameters:
- in: path
name: id
description: category id
        required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetAdminCategory'
/admin/category/addcategory:
post:
tags:
- admin
description: |-
Get Category list by title info
<br>
operationId: AdminController.Post Category by pid title code grade
parameters:
- in: query
name: pid
description: parentid of category
type: string
- in: query
name: title
description: title of category
type: string
- in: query
name: code
description: code of category
type: string
- in: query
name: grade
description: grade of category
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AddAdminCategory'
/admin/categorytitle:
get:
tags:
- admin
description: |-
Get Category list by title info
<br>
operationId: AdminController.Get Category by title
parameters:
- in: query
name: title
description: title of search
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetAdminCategory'
/admin/flowaccesscontext:
post:
tags:
- admin
description: |-
post AccessContext..
<br>
operationId: FlowController.post wf AccessContext...
parameters:
- in: query
name: name
description: The name of AccessContext
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowaccesscontextlist:
get:
tags:
- admin
description: |-
        get AccessContext list..
<br>
operationId: FlowController.get wf AccessContext...
parameters:
- in: query
name: page
description: The page of AccessContext
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowaccesscontextupdate:
post:
tags:
- admin
description: |-
post workflowdocaccesscontext..
<br>
operationId: FlowController.post wf docaccesscontext...
parameters:
- in: query
name: name
description: The name of docaccesscontext
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowaction:
post:
tags:
- admin
description: |-
post workflowdocaction..
<br>
operationId: FlowController.post wf docaction...
parameters:
- in: query
name: name
description: The name of docaction
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowactiondelete:
post:
tags:
- admin
description: |-
        delete workflowdocaction..
        <br>
      operationId: FlowController.post wf docaction delete...
parameters:
- in: query
name: name
description: The name of docaction
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowactionlist:
get:
tags:
- admin
description: |-
get workflowdocaction..
<br>
operationId: FlowController.get wf docaction...
parameters:
- in: query
name: page
description: The page of docaction
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowactionupdate:
post:
tags:
- admin
description: |-
        update workflowdocaction..
        <br>
      operationId: FlowController.post wf docaction update...
parameters:
- in: query
name: name
description: The name of docaction
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowdoc:
post:
tags:
- admin
description: |-
post document
<br>
operationId: FlowController.post wf document
parameters:
- in: query
name: dtid
description: The doctypeid of document
required: true
type: string
- in: query
name: acid
description: The accesscontext of document
required: true
type: string
- in: query
name: gid
description: The groupid of Group
required: true
type: string
- in: query
name: name
description: The name of document
required: true
type: string
- in: query
name: data
description: The data of document
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowdocevent:
post:
tags:
- admin
description: |-
get docevent
<br>
operationId: FlowController.post wf event
parameters:
- in: query
name: name
description: The name of event
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowdoceventlist:
get:
tags:
- admin
description: |-
get doceventlist
<br>
operationId: FlowController.get wf eventlist
parameters:
- in: query
name: page
description: The page of event
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowdoclist:
get:
tags:
- admin
description: |-
get workflow doclist
<br>
operationId: FlowController.get wf doclist
parameters:
- in: query
name: dtid
description: The id of doctype
required: true
type: string
- in: query
name: page
description: The page of doc
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowdocumentdetail:
get:
tags:
- admin
description: |-
get documentdetail
<br>
operationId: FlowController.get wf document details
parameters:
- in: query
name: dtid
description: The id of doctype
required: true
type: string
- in: query
name: docid
description: The id of doc
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowdocumentlist:
get:
tags:
- admin
description: |-
get document
<br>
operationId: FlowController.get wf document
parameters:
- in: query
name: dtid
description: The id of doctype
required: true
type: string
- in: query
name: acid
description: The id of accesscontext
required: true
type: string
- in: query
name: gid
description: The id of group
type: string
- in: query
name: dsid
description: The id of docstate
type: string
- in: query
name: page
description: The page of doc
required: true
type: string
- in: query
name: limit
description: The limit page of doc
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowdocumentlist2:
get:
tags:
- admin
description: |-
get document
<br>
      operationId: FlowController.get wf document list2
parameters:
- in: query
name: dtid
description: The id of doctype
required: true
type: string
- in: query
name: acid
description: The id of accesscontext
required: true
type: string
- in: query
name: gid
description: The id of group
type: string
- in: query
name: dsid
description: The id of docstate
type: string
- in: query
name: page
description: The page of doc
required: true
type: string
- in: query
name: limit
description: The limit page of doc
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowgroup:
post:
tags:
- admin
description: |-
post Group..
<br>
operationId: FlowController.post wf Group...
parameters:
- in: query
name: name
description: The name of Group
required: true
type: string
- in: query
name: grouptype
description: The type of Group
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowgrouplist:
get:
tags:
- admin
description: |-
        get Group list..
<br>
operationId: FlowController.get wf Group...
parameters:
- in: query
name: page
description: The page of Group
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowgroupmailbox:
get:
tags:
- admin
description: |-
get groupmailbox
<br>
operationId: FlowController.get group mailbox
parameters:
- in: query
name: gid
description: The id of group
required: true
type: string
- in: query
name: page
description: The page of mailbox
required: true
type: string
- in: query
name: limit
description: The limit page of mailbox
type: string
- in: query
name: unread
description: The unread of mailbox
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowgrouprole:
post:
tags:
- admin
description: |-
post GroupRole..
<br>
operationId: FlowController.post wf GroupRole...
parameters:
- in: query
name: name
description: The name of GroupRole
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowgrouprolelist:
get:
tags:
- admin
description: |-
get GroupRole..
<br>
operationId: FlowController.get wf GroupRole...
parameters:
- in: query
name: page
description: The page of GroupRole
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowgroupuserslist:
get:
tags:
- admin
description: |-
        get GroupUsers list..
<br>
operationId: FlowController.get wf GroupUsers...
parameters:
- in: query
name: name
description: The name of GroupUser
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flownext:
post:
tags:
- admin
description: |-
post workflow next
<br>
operationId: FlowController.post wf next
parameters:
- in: query
name: dtid
description: The id of doctype
required: true
type: string
- in: query
name: daid
description: The id of action
required: true
type: string
- in: query
name: docid
description: The id of document
required: true
type: string
- in: query
name: gid
description: The id of group
required: true
type: string
- in: query
name: text
description: The text of apply
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flownode:
post:
tags:
- admin
description: |-
post Node..
<br>
operationId: FlowController.post wf Node...
parameters:
- in: query
name: name
description: The name of Node
required: true
type: string
- in: query
name: dtid
description: The doctypeid of Node
required: true
type: string
- in: query
name: dsid
description: The docstateid of Node
required: true
type: string
- in: query
name: acid
description: The accesssid of Node
required: true
type: string
- in: query
name: nodetype
description: The nodetype of Node
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flownodelist:
get:
tags:
- admin
description: |-
        get Node list..
<br>
operationId: FlowController.get wf Node...
parameters:
- in: query
name: page
description: The page of Node
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowpermission:
post:
tags:
- admin
description: |-
post Permission..
<br>
operationId: FlowController.post wf Permission...
parameters:
- in: query
name: name
description: The name of Permission
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowrole:
post:
tags:
- admin
description: |-
post Role..
<br>
operationId: FlowController.post wf Role...
parameters:
- in: query
name: name
description: The name of Role
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowrolelist:
get:
tags:
- admin
description: |-
        get Role list..
<br>
operationId: FlowController.get wf Role...
parameters:
- in: query
name: page
description: The page of Role
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowrolepermissionlist:
get:
tags:
- admin
description: |-
        get Permission list..
<br>
operationId: FlowController.get wf Permission...
parameters:
- in: query
name: page
description: The page of Permission
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowstate:
post:
tags:
- admin
description: |-
post workflowdocstate..
<br>
operationId: FlowController.post wf docstate...
parameters:
- in: query
name: name
description: The name of docstate
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowstatedelete:
post:
tags:
- admin
description: |-
        delete workflowdocstate..
        <br>
      operationId: FlowController.post wf docstate delete...
parameters:
- in: query
name: name
description: The name of docstate
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowstatelist:
get:
tags:
- admin
description: |-
        get workflowdocstate list..
        <br>
      operationId: FlowController.get wf docstate list...
parameters:
- in: query
name: page
description: The page of docstate
type: string
- in: query
name: limit
description: The size of page
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowstateupdate:
post:
tags:
- admin
description: |-
        update workflowdocstate..
        <br>
      operationId: FlowController.post wf docstate update...
parameters:
- in: query
name: name
description: The name of docstate
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowtransition:
post:
tags:
- admin
description: |-
        create a workflow transition
<br>
operationId: FlowController.post wf transition...
parameters:
- in: query
name: name
description: The name of transition
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowtransitiondelete:
post:
tags:
- admin
description: |-
        delete a workflow transition
<br>
operationId: FlowController.post wf doctransition...
parameters:
- in: query
name: name
description: The name of doctransition
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowtransitionlist:
get:
tags:
- admin
description: |-
        get the workflow transition list
<br>
operationId: FlowController.get wf transition...
parameters:
- in: query
name: page
description: The page of transition
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowtransitionupdate:
post:
tags:
- admin
description: |-
        update a workflow transition
<br>
operationId: FlowController.post wf doctransition...
parameters:
- in: query
name: name
description: The name of doctransition
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowtype:
post:
tags:
- admin
description: |-
        create a workflow document type
<br>
operationId: FlowController.post wf doctype...
parameters:
- in: query
name: name
description: The name of doctype
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowtypedelete:
post:
tags:
- admin
description: |-
        delete a workflow document type
<br>
operationId: FlowController.post wf doctype...
parameters:
- in: query
name: name
description: The name of doctype
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowtypelist:
get:
tags:
- admin
description: |-
        get the workflow document type list
<br>
operationId: FlowController.get wf doctypelist...
parameters:
- in: query
name: page
description: The page of doctype
type: string
- in: query
name: limit
description: The size of page
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowtypeupdate:
post:
tags:
- admin
description: |-
        update a workflow document type
<br>
operationId: FlowController.post wf doctype...
parameters:
- in: query
name: name
description: The name of doctype
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowuser:
post:
tags:
- admin
description: |-
        create a workflow user
<br>
operationId: FlowController.post wf user...
parameters:
- in: query
name: firstname
description: The firstname of user
required: true
type: string
- in: query
name: lastname
description: The lastname of user
required: true
type: string
- in: query
name: email
description: The email of user
required: true
type: string
- in: query
name: active
        description: The active status of the user
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowusergroup:
post:
tags:
- admin
description: |-
        create a workflow user group
<br>
operationId: FlowController.post wf GroupUser...
parameters:
- in: query
name: name
description: The name of GroupUser
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowuserlist:
get:
tags:
- admin
description: |-
        get the workflow user list
<br>
operationId: FlowController.get wf user...
parameters:
- in: query
name: page
description: The page of user
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowusermailbox:
get:
tags:
- admin
description: |-
        get a user's mailbox
<br>
operationId: FlowController.get user mailbox
parameters:
- in: query
name: uid
description: The id of user
required: true
type: string
- in: query
name: page
description: The page of mailbox
required: true
type: string
      - in: query
        name: limit
        description: The page size of the mailbox list
        type: string
      - in: query
        name: unread
        description: The unread filter for the mailbox
        type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowusermailbox2:
get:
tags:
- admin
description: |-
        get a user's mailbox
<br>
operationId: FlowController.get user mailbox
parameters:
- in: query
name: uid
description: The id of user
required: true
type: string
- in: query
name: page
description: The page of mailbox
required: true
type: string
      - in: query
        name: limit
        description: The page size of the mailbox list
        type: string
      - in: query
        name: unread
        description: The unread filter for the mailbox
        type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowworkflow:
post:
tags:
- admin
description: |-
        create a workflow
<br>
operationId: FlowController.post wf Workflow...
parameters:
- in: query
name: name
description: The name of Workflow
required: true
type: string
- in: query
name: dtid
description: The doctypeid of Workflow
required: true
type: string
- in: query
name: dsid
description: The docstate of Workflow
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/flowworkflowlist:
get:
tags:
- admin
description: |-
        get the workflow list
<br>
operationId: FlowController.post wf Workflow...
parameters:
- in: query
name: page
description: The page of Workflow
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/getwxprojectconfig:
get:
tags:
- admin
description: |-
        get the wx project config by project id
<br>
operationId: AdminController.get wx projectconfig by projectid
parameters:
- in: query
name: projectid
description: The id of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AddArticle'
"400":
description: Invalid page supplied
"404":
          description: article not found
/admin/jsoneditor:
get:
tags:
- admin
description: |-
        show the json editor for a project's config
<br>
operationId: AdminController.get wx projectconfig by projectid
parameters:
- in: query
name: projectid
description: The id of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AddArticle'
"400":
description: Invalid page supplied
"404":
          description: article not found
/admin/liucheng:
get:
tags:
- admin
description: |-
        get the flowchart text for a document state
<br>
operationId: FlowController.get flowchart text
parameters:
- in: query
name: docstate
description: The state of document
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/putwxprojectconfig:
post:
tags:
- admin
description: |-
        update the wx project config by project id
<br>
operationId: AdminController.update wx projectconfig by projectid
parameters:
- in: query
name: projectid
description: The id of project
required: true
type: string
- in: query
name: projectconfig
description: The json of projectconfig
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AddArticle'
"400":
description: Invalid page supplied
"404":
          description: article not found
/admin/workflow:
get:
tags:
- admin
description: |-
        show the workflow page
<br>
operationId: FlowController.show wf page
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/admin/wxflowdoc:
post:
tags:
- admin
description: |-
        create a workflow document (wx)
<br>
operationId: FlowController.post wf document
parameters:
- in: query
name: dtid
description: The doctypeid of document
required: true
type: string
- in: query
name: acid
description: The accesscontext of document
required: true
type: string
- in: query
name: gid
description: The groupid of Group
required: true
type: string
- in: query
name: name
description: The name of document
required: true
type: string
- in: query
name: id
description: The id of document
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/admin/wxflownext:
post:
tags:
- admin
description: |-
        advance a workflow document to its next state (wx)
<br>
operationId: FlowController.post wf next
parameters:
- in: query
name: dtid
description: The id of doctype
required: true
type: string
- in: query
name: daid
description: The id of action
required: true
type: string
- in: query
name: articleid
description: The id of article
required: true
type: string
- in: query
name: gid
description: The id of group
required: true
type: string
- in: query
name: messageid
description: The messageid of doc
required: true
type: string
- in: query
name: text
description: The text of apply
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
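  # A usage sketch (YAML comment, not part of the spec). Assuming the API is
  # served at http://localhost:8080, advancing a workflow document one step
  # might look like (all ids hypothetical):
  #   curl -X POST "http://localhost:8080/admin/wxflownext?dtid=1&daid=2&articleid=3&gid=4&messageid=5"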
/admin/wxflowusermailbox2:
get:
tags:
- admin
description: |-
        get a user's mailbox (wx)
<br>
operationId: FlowController.get user mailbox
parameters:
- in: query
name: page
description: The page of mailbox
required: true
type: string
      - in: query
        name: limit
        description: The page size of the mailbox list
        type: string
      - in: query
        name: unread
        description: The unread filter for the mailbox
        type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/adminlog/errlog:
get:
tags:
- adminlog
description: |-
        get the error log list
<br>
operationId: AdminLogController.getAdminBlock
responses:
"200":
description: ""
schema:
$ref: '#/definitions/success'
"400":
description: Invalid page
"404":
description: page not found
/adminlog/infolog:
get:
tags:
- adminlog
description: |-
        get the info log list
<br>
operationId: AdminLogController.getAdminBlock
responses:
"200":
description: ""
schema:
$ref: '#/definitions/success'
"400":
description: Invalid page
"404":
description: page not found
/ansys/addansysarticle/{id}:
post:
tags:
- ansys
description: |-
post article by ansysID
<br>
operationId: AnsysController.post ansys artile by ansysID
parameters:
- in: query
name: id
description: The id of ansysID
required: true
type: string
- in: query
name: title
description: The title of article
required: true
type: string
- in: query
name: content
description: The content of article
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.ansysArticle'
"400":
description: Invalid page supplied
"404":
description: article not found
/ansys/ansysarticle/{id}:
get:
tags:
- ansys
description: |-
get article by ansysarticleID
<br>
operationId: AnsysController.get ansys artile by ansysarticleID
parameters:
- in: query
name: id
description: The id of ansysarticleID
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.ansysArticle'
"400":
description: Invalid page supplied
"404":
description: article not found
/ansys/ansyscal/{id}:
get:
tags:
- ansys
description: |-
        show the ansys calculation page for a project
<br>
operationId: AnsysController.get ansys
parameters:
- in: path
name: id
description: The id of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetansysPage'
"400":
description: Invalid page supplied
"404":
description: Page not found
/ansys/deleteansys/{id}:
post:
tags:
- ansys
description: |-
delete ansys by id
<br>
operationId: AnsysController.delete ansys
parameters:
- in: path
name: id
description: The id of ansys
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AnsysApdl'
"400":
description: Invalid page supplied
"404":
description: ansys not found
/ansys/editoransysarticle/{id}:
get:
tags:
- ansys
description: |-
        get the article editor by ansysarticle id
<br>
operationId: AnsysController.get editoransys article by ansysarticleID
parameters:
- in: path
name: id
description: The id of ansysarticleID
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.ansysArticle'
"400":
description: Invalid page supplied
"404":
description: article not found
/ansys/getWxansyshistory/{id}:
get:
tags:
- ansys
description: |-
        get the ansys history list (wx)
<br>
operationId: AnsysController.get ansyswxhistory...
parameters:
- in: path
name: id
description: The id of ansyss
required: true
type: string
- in: query
name: searchText
description: The searchText of ansyshistory
type: string
- in: query
name: page
description: The page of ansyshistory
type: string
- in: query
name: limit
description: The size of page
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AnsysHistory'
"400":
description: Invalid page supplied
"404":
description: data not found
/ansys/getansys/{id}:
get:
tags:
- ansys
description: |-
        get an ansys apdl model by project id
<br>
operationId: AnsysController.get ansys
parameters:
- in: path
name: id
description: The id of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AnsysApdl'
"400":
description: Invalid page supplied
"404":
description: Page not found
/ansys/getansys/ansyslist/{id}:
get:
tags:
- ansys
description: |-
        get the ansys list for a project
<br>
operationId: AnsysController.get ansys
parameters:
- in: path
name: id
description: The id of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetansysPage'
"400":
description: Invalid page supplied
"404":
description: Page not found
/ansys/getansys/getansyss/{id}:
get:
tags:
- ansys
description: |-
        get the ansys models of a project
<br>
operationId: AnsysController.get projproducts...
parameters:
- in: path
name: id
description: The id of projproducts
required: true
type: string
- in: query
name: searchText
description: The searchText of projproducts
type: string
- in: query
name: page
description: The page of projproducts
type: string
- in: query
name: limit
description: The size of page
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AnsysApdl'
"400":
description: Invalid page supplied
"404":
description: data not found
/ansys/getansys/getwxansyss/{id}:
get:
tags:
- ansys
description: |-
        get the ansys list (wx)
<br>
operationId: AnsysController.get ansyss...
parameters:
- in: path
name: id
description: The class id of ansysansyss
required: true
type: string
- in: query
name: searchText
description: The searchText of ansysansyss
type: string
- in: query
name: page
description: The page of projproducts
type: string
- in: query
name: limit
description: The size of page
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AnsysApdl'
"400":
description: Invalid page supplied
"404":
description: data not found
/ansys/getansyscalinput/{id}:
get:
tags:
- ansys
description: |-
        get the ansys calculation inputs
<br>
operationId: AnsysController.get ansys
parameters:
- in: path
name: id
description: The id of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AnsysInputs'
"400":
description: Invalid page supplied
"404":
description: Page not found
/ansys/getansyscaloutput/{id}:
get:
tags:
- ansys
description: |-
        get the ansys calculation outputs
<br>
operationId: AnsysController.get ansys
parameters:
- in: path
name: id
description: The id of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AnsysOutputs'
"400":
description: Invalid page supplied
"404":
description: Page not found
/ansys/getansyshistory/{id}:
get:
tags:
- ansys
description: |-
        get the ansys history list
<br>
operationId: AnsysController.get history...
parameters:
- in: path
name: id
description: The id of ansyss
required: true
type: string
- in: query
name: searchText
description: The searchText of ansyss
type: string
- in: query
name: page
description: The page of ansyss
type: string
- in: query
name: limit
description: The size of page
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AnsysHistory'
"400":
description: Invalid page supplied
"404":
description: data not found
/ansys/getansyshistorycal/{id}:
get:
tags:
- ansys
description: |-
        get an ansys calculation history record
<br>
operationId: AnsysController.get ansyscal history
parameters:
- in: path
name: id
description: The id of history
required: true
type: string
- in: path
name: ansysid
description: The id of ansyss
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetansysPage'
"400":
description: Invalid page supplied
"404":
description: Page not found
/ansys/getansyshistoryinput/{id}:
get:
tags:
- ansys
description: |-
        get the inputs of an ansys history record
<br>
operationId: AnsysController.get ansys
parameters:
- in: path
name: id
description: The id of history
required: true
type: string
- in: query
name: ansysid
description: The id of ansysapdl
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AnsysInputs'
"400":
description: Invalid page supplied
"404":
description: Page not found
/ansys/getansyshistoryoutput/{id}:
get:
tags:
- ansys
description: |-
        get the outputs of an ansys history record
<br>
operationId: AnsysController.get ansys
parameters:
- in: path
name: id
description: The id of history
required: true
type: string
- in: query
name: ansysid
description: The id of ansysapdl
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AnsysOutputs'
"400":
description: Invalid page supplied
"404":
description: Page not found
/ansys/getansyspdf/{id}:
get:
tags:
- ansys
description: |-
get ansyspdf by ansysapdlID
<br>
operationId: AnsysController.get ansyspdf by ansysapdlID
parameters:
- in: query
name: id
description: The id of ansysapdlID
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.ansysPdf'
"400":
description: Invalid page supplied
"404":
description: ansyspdf not found
/ansys/gethistoryansys/{id}:
get:
tags:
- ansys
description: |-
get ansys history
<br>
operationId: AnsysController.get ansys history
parameters:
- in: path
name: id
description: The id of ansyss
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AnsysHistory'
"400":
description: Invalid page supplied
"404":
description: Page not found
/ansys/getwxansyscalinput/{id}:
get:
tags:
- ansys
description: |-
        get the ansys calculation inputs (wx)
<br>
operationId: AnsysController.get ansys
parameters:
- in: path
name: id
description: The id of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AnsysInputs'
"400":
description: Invalid page supplied
"404":
description: Page not found
/ansys/getwxansyscaloutput/{id}:
get:
tags:
- ansys
description: |-
        get the ansys calculation outputs (wx)
<br>
operationId: AnsysController.get ansys
parameters:
- in: path
name: id
description: The id of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AnsysOutputs'
"400":
description: Invalid page supplied
"404":
description: Page not found
/ansys/getwxansysclass/{id}:
get:
tags:
- ansys
description: |-
        get the ansys class tree (wx)
<br>
operationId: AnsysController.get ansys
parameters:
- in: path
name: id
description: The id of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/FileNode1'
"400":
description: Invalid page supplied
"404":
description: Page not found
/ansys/postansys:
post:
tags:
- ansys
description: |-
        run an ansys calculation
<br>
operationId: AnsysController.post ansys
parameters:
- in: query
name: inputdata
description: The inputdata
type: string
- in: query
name: ansysid
description: The ansysid
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetansysPage'
"400":
description: Invalid page supplied
"404":
description: Page not found
/ansys/postwxansys:
post:
tags:
- ansys
description: |-
        run an ansys calculation (wx)
<br>
operationId: AnsysController.post ansys
parameters:
- in: query
name: inputdata
description: The inputdata
type: string
- in: query
name: ansysid
description: The ansysid
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetansysPage'
"400":
description: Invalid page supplied
"404":
description: Page not found
/ansys/putansysapdl:
post:
tags:
- ansys
description: |-
        update an ansys apdl field
<br>
operationId: AnsysController.put ansysapdl
parameters:
- in: path
name: id
description: The id of ansysapdl
required: true
type: string
- in: query
name: value
description: The value
required: true
type: string
- in: query
name: name
description: The name
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AnsysApdl'
"400":
description: Invalid page supplied
"404":
description: Page not found
/ansys/putansyscalinput:
post:
tags:
- ansys
description: |-
        update an ansys input field
<br>
operationId: AnsysController.put ansysinput
parameters:
- in: path
name: id
description: The id of ansysinput
required: true
type: string
- in: query
name: value
description: The value
required: true
type: string
- in: query
name: name
description: The name
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AnsysInputs'
"400":
description: Invalid page supplied
"404":
description: Page not found
/ansys/putansyscaloutput:
post:
tags:
- ansys
description: |-
        update an ansys output comment
<br>
operationId: AnsysController.put ansys outputcomment
parameters:
- in: path
name: id
description: The id of output
required: true
type: string
- in: query
name: value
description: The value
required: true
type: string
- in: query
name: name
description: The name
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AnsysOutputs'
"400":
description: Invalid page supplied
"404":
description: Page not found
/ansys/updateansysarticle:
post:
tags:
- ansys
description: |-
        update an ansys article from the editor
<br>
operationId: AnsysController.post editoransys article
parameters:
- in: query
name: aid
description: The id of ansysarticle
required: true
type: string
- in: query
name: subtext
description: The subtext of ansysarticle
required: true
type: string
- in: query
name: content
description: The content of ansysarticle
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.ansysArticle'
"400":
description: Invalid page supplied
"404":
description: article not found
/ansys/uploadansys/{id}:
get:
tags:
- ansys
description: |-
        show the ansys upload page
<br>
operationId: AnsysController.upload ansysansys
parameters:
- in: path
name: id
description: The id of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetansysPage'
"400":
description: Invalid page supplied
"404":
description: Page not found
/ansys/uploadansysapdl/{id}:
post:
tags:
- ansys
description: |-
        upload an apdl file via Bootstrap FileInput
<br>
operationId: AnsysController.post bootstrapfileinput
parameters:
- in: path
name: id
description: The id of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/SUCCESS'
"400":
description: Invalid page supplied
"404":
description: page not found
/ansys/wxansysarticle/{id}:
get:
tags:
- ansys
description: |-
get wxarticle by ansysarticleID
<br>
operationId: AnsysController.get wxansys artile by ansysarticleID
parameters:
- in: path
name: id
description: The id of ansysarticleID
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.ansysArticle'
"400":
description: Invalid page supplied
"404":
description: article not found
/bbs/bbs:
post:
tags:
- bbs
description: |-
        create a bbs post
<br>
operationId: BbsController.post bbs
parameters:
- in: query
name: userId
description: The userId for bbs
required: true
type: string
- in: query
name: desc
description: The description for bbs
required: true
type: string
- in: query
name: year
description: The year for bbs
required: true
type: string
- in: query
name: month
description: The month for bbs
required: true
type: string
- in: query
name: day
description: The day for bbs
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: bbs not found
/bbs/bbsgetbbs:
get:
tags:
- bbs
description: |-
        get bbs check-in posts for a month
<br>
operationId: BbsController.get checkin bbs
parameters:
- in: query
name: year
description: The year for bbs
required: true
type: string
- in: query
name: month
description: The month for bbs
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/bbs/getbbs:
get:
tags:
- bbs
description: |-
        get bbs posts for a day
<br>
operationId: BbsController.get bbs
parameters:
- in: query
name: year
description: The year for bbs
required: true
type: string
- in: query
name: month
description: The month for bbs
required: true
type: string
- in: query
name: day
description: The day for bbs
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: bbs not found
/cart/createproductcart:
post:
tags:
- cart
description: |-
        create a new cart from product ids
<br>
operationId: CartController.post create a new cart
parameters:
- in: query
name: ids
description: The ids of product
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AddArticle'
"400":
description: Invalid page supplied
"404":
          description: article not found
/cart/deleteusercart:
post:
tags:
- cart
description: |-
        delete user cart items
<br>
operationId: CartController.post delete usercart
parameters:
- in: query
name: ids
description: The ids of usercats
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.DeleteUserCart'
"400":
description: Invalid page supplied
"404":
          description: article not found
/cart/getapplycart:
get:
tags:
- cart
description: |-
        get the user cart list for application
<br>
operationId: CartController.get usercartlist
parameters:
- in: query
name: status
description: The status for usercart list
required: true
type: string
- in: query
name: searchText
description: The searchText of usercart
type: string
- in: query
name: pageNo
description: The page for usercart list
required: true
type: string
- in: query
name: limit
description: The limit of page for usercart list
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.Create'
"400":
description: Invalid page supplied
"404":
description: cart not found
/cart/getapprovalcart:
get:
tags:
- cart
description: |-
        get the user cart list for approval
<br>
operationId: CartController.get usercartlist
parameters:
- in: query
name: status
description: The status for usercart list
required: true
type: string
- in: query
name: searchText
description: The searchText of usercart
type: string
- in: query
name: pageNo
description: The page for usercart list
required: true
type: string
- in: query
name: limit
description: The limit of page for usercart list
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.Create'
"400":
description: Invalid page supplied
"404":
description: cart not found
/cart/getcart:
get:
tags:
- cart
description: |-
        show the user cart page
<br>
operationId: CartController.get usercarttpl
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.Create'
"400":
description: Invalid page supplied
"404":
description: cart not found
/cart/updateapprovalcart:
post:
tags:
- cart
description: |-
        update the approval status of carts
<br>
operationId: CartController.post update carts
parameters:
- in: query
name: ids
description: The ids of approvalcats
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.Update'
"400":
description: Invalid page supplied
"404":
description: cart not found
/chat/avatar/{text}:
get:
tags:
- chat
description: |-
        get an avatar generated from text
<br>
operationId: ChatController.get avatar
parameters:
- in: path
name: text
description: The text
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.Avatar'
"400":
description: Invalid page supplied
"404":
description: Page not found
/chat/chat:
get:
tags:
- chat
description: |-
        show the chat page
<br>
operationId: ChatController.get chat
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.UserTemple'
"400":
description: Invalid page supplied
"404":
description: Page not found
/chat/wschat:
get:
tags:
- chat
description: |-
        open the chat websocket connection
<br>
operationId: ChatController.get wschat
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetChat'
"400":
description: Invalid page supplied
"404":
description: Page not found
/checkin/activity/actInfo:
post:
tags:
- checkin
description: |-
        get activity info for a user
<br>
operationId: CheckController.post checkin person
parameters:
- in: query
name: username
description: The userId for person
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/checkin/activity/apply:
post:
tags:
- checkin
description: |-
        apply to a check-in activity
<br>
operationId: CheckController.post checkin person
parameters:
- in: query
name: username
description: The userId for person
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/checkin/activity/create:
post:
tags:
- checkin
description: |-
        create a check-in activity
<br>
operationId: CheckController.post checkin activity
parameters:
- in: query
name: CreaterId
description: The CreaterId for activity
required: true
type: string
- in: query
name: projectid
description: The projectid of activity
required: true
type: string
- in: query
name: Caption
description: The Caption for activity
required: true
type: string
- in: query
name: Desc
description: The Desc for activity
required: true
type: string
- in: query
name: Location
description: The Location for activity
required: true
type: string
- in: query
name: Lat
description: The Lat for activity
required: true
type: string
- in: query
name: Lng
description: The Lng for activity
required: true
type: string
- in: query
name: SatrtDate
        description: The start date for the activity
required: true
type: string
- in: query
name: EndDate
description: The EndDate for activity
required: true
type: string
- in: query
name: IfFace
description: The IfFace for activity
required: true
type: string
- in: query
name: IfPhoto
description: The IfPhoto for activity
required: true
type: string
- in: query
name: IfLocation
description: The IfLocation for activity
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
        description: article not found
/checkin/activity/getall:
post:
tags:
- checkin
description: |-
post person
<br>
operationId: CheckController.post checkin person
parameters:
- in: query
name: projectid
description: The projectid of activity
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
        description: article not found
/checkin/activity/haveApply:
post:
tags:
- checkin
description: |-
post person
<br>
operationId: CheckController.post checkin person
parameters:
- in: query
name: username
      description: The username of the person
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
        description: article not found
/checkin/activity/like:
post:
tags:
- checkin
description: |-
post person
<br>
operationId: CheckController.post checkin person
parameters:
- in: query
name: username
      description: The username of the person
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
        description: article not found
/checkin/check:
post:
tags:
- checkin
description: |-
post person
<br>
operationId: CheckController.post checkin person
parameters:
- in: query
name: userId
description: The userId for person
required: true
type: string
- in: query
name: activityId
description: The activityId of activity
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
        description: article not found
/checkin/check/compare:
post:
tags:
- checkin
description: |-
post person
<br>
operationId: CheckController.post checkin person
parameters:
- in: query
name: username
      description: The username of the person
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
        description: article not found
/checkin/check/details:
post:
tags:
- checkin
description: |-
post person
<br>
operationId: CheckController.post checkin person
parameters:
- in: query
name: username
      description: The username of the person
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
        description: article not found
/checkin/checkgetcheck:
get:
tags:
- checkin
description: |-
get check
<br>
operationId: CheckController.get checkin check
parameters:
- in: query
name: userId
description: The userId for check
required: true
type: string
- in: query
name: activityId
description: The activityid for check
required: true
type: string
- in: query
name: year
description: The year for check
required: true
type: string
- in: query
name: month
description: The month for check
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
        description: article not found
/checkin/checksignature:
get:
tags:
- checkin
description: |-
get Signature
<br>
operationId: CheckController.get Signature
parameters:
- in: query
name: signature
description: The signature of wx
required: true
type: string
- in: query
name: timestamp
description: The timestamp of wx
required: true
type: string
- in: query
name: nonce
description: The nonce for wx
required: true
type: string
- in: query
name: echostr
description: The echostr for wx
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
        description: article not found
/checkin/details/date:
post:
tags:
- checkin
description: |-
post person
<br>
operationId: CheckController.post checkin person
parameters:
- in: query
name: username
      description: The username of the person
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
        description: article not found
/checkin/monthcheck:
get:
tags:
- checkin
description: |-
get monthcheck
<br>
operationId: CheckController.get mothcheckin
parameters:
- in: query
name: page
description: The page of check
type: string
- in: query
name: limit
description: The size of check
type: string
- in: query
name: activityId
description: The activityid for check
required: true
type: string
- in: query
name: year
description: The year for check
required: true
type: string
- in: query
name: month
description: The month for check
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
        description: article not found
/checkin/monthchecksum:
get:
tags:
- checkin
description: |-
get monthcheck
<br>
operationId: CheckController.get mothcheckin
parameters:
- in: query
name: projectid
description: The projectid of check
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
        description: article not found
/checkin/person:
post:
tags:
- checkin
description: |-
post person
<br>
operationId: CheckController.post checkin person
parameters:
- in: query
name: username
      description: The username of the person
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
        description: article not found
/checkin/sendmessage:
post:
tags:
- checkin
description: |-
post Message
<br>
operationId: CheckController.post Message
parameters:
- in: query
name: app_version
description: The app_version of wx
required: true
type: string
- in: query
name: template_id
description: The template_id of Message
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
        description: article not found
/checkin/subscribemessage:
post:
tags:
- checkin
description: |-
post subscribemessage
<br>
operationId: CheckController.post subscribemessage
parameters:
- in: query
name: tmplIds
description: The tmplIds of wx
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
        description: article not found
/elastic/get:
get:
tags:
- elastic
description: |-
get elasticsearch web
<br>
operationId: ElasticController.get elasticsearch web
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetElastic'
"400":
description: Invalid page supplied
"404":
description: Elastic not found
/elastic/search:
get:
tags:
- elastic
description: |-
      get search
<br>
operationId: ElasticController.get search
parameters:
- in: query
name: q
      description: The query string
type: string
- in: formData
name: a
description: The after...
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetSearch'
"400":
description: Invalid page supplied
"404":
description: Search not found
/elastic/tika:
post:
tags:
- elastic
description: |-
post tika
<br>
operationId: ElasticController.post tika
responses:
"400":
description: Invalid page supplied
"404":
description: Tika not found
/elastic/upload/{id}:
post:
tags:
- elastic
description: |-
post file by BootstrapFileInput
<br>
operationId: ElasticController.post bootstrapfileinput
parameters:
- in: path
name: id
description: The id of project
required: true
type: string
- in: query
name: prodlabel
description: The prodlabel
type: string
- in: query
name: prodprincipal
description: The prodprincipal
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/SUCCESS'
"400":
description: Invalid page supplied
"404":
description: page not found
/elastic/uploadelastic/{id}:
get:
tags:
- elastic
description: |-
get upload file to tika&elastic html
<br>
operationId: ElasticController.upload file to tika&elastic html
parameters:
- in: path
name: id
description: The id of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.Elastic'
"400":
description: Invalid page supplied
"404":
description: Page not found
/estimate/getestimatecostarchi/{id}:
get:
tags:
- estimate
description: |-
get costarchi tpl
<br>
operationId: EstimateController.getestimatecostarchi
parameters:
- in: path
name: id
description: The id of projectphase
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetEstimateCostArchi'
"400":
description: Invalid page supplied
"404":
description: Page not found
/estimate/getestimatecostarchidata/{id}:
get:
tags:
- estimate
description: |-
get costArchi
<br>
operationId: EstimateController.getEstimateCostArchiData
parameters:
- in: path
name: id
description: The id of estimatephase
required: true
type: string
- in: query
name: page
description: The page of estimatecostarchi
type: string
- in: query
name: limit
description: The size of page
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetEstimateCostArchi'
"400":
description: Invalid page supplied
"404":
description: Page not found
/estimate/getestimatecostelect/{id}:
get:
tags:
- estimate
description: |-
get costelect tpl
<br>
operationId: EstimateController.getestimatecostelect
parameters:
- in: path
name: id
description: The id of projectphase
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetEstimateCostElect'
"400":
description: Invalid page supplied
"404":
description: Page not found
/estimate/getestimatecostelectdata/{id}:
get:
tags:
- estimate
description: |-
get costArchi
<br>
operationId: EstimateController.getEstimateCostElectData
parameters:
- in: path
name: id
description: The id of estimatephase
required: true
type: string
- in: query
name: page
description: The page of estimatecostelect
type: string
- in: query
name: limit
description: The size of page
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetEstimateCostElect'
"400":
description: Invalid page supplied
"404":
description: Page not found
/estimate/getestimatecostmetal/{id}:
get:
tags:
- estimate
description: |-
get costmetal tpl
<br>
operationId: EstimateController.getestimatecostmetal
parameters:
- in: path
name: id
description: The id of projectphase
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetEstimateCostMetal'
"400":
description: Invalid page supplied
"404":
description: Page not found
/estimate/getestimatecostmetaldata/{id}:
get:
tags:
- estimate
description: |-
get costArchi
<br>
operationId: EstimateController.getEstimateCostMetalData
parameters:
- in: path
name: id
description: The id of estimatephase
required: true
type: string
- in: query
name: page
description: The page of estimatecostmetal
type: string
- in: query
name: limit
description: The size of page
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetEstimateCostMetal'
"400":
description: Invalid page supplied
"404":
description: Page not found
/estimate/getestimatecosttemp/{id}:
get:
tags:
- estimate
description: |-
get costtemp tpl
<br>
operationId: EstimateController.getestimatecostTemp
parameters:
- in: path
name: id
description: The id of projectphase
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetEstimateCostTemp'
"400":
description: Invalid page supplied
"404":
description: Page not found
/estimate/getestimatecosttempdata/{id}:
get:
tags:
- estimate
description: |-
get costTemp
<br>
operationId: EstimateController.getEstimateCostTempData
parameters:
- in: path
name: id
description: The id of estimatephase
required: true
type: string
- in: query
name: page
description: The page of estimatecosttemp
type: string
- in: query
name: limit
description: The size of page
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetEstimateCostTemp'
"400":
description: Invalid page supplied
"404":
description: Page not found
/estimate/getestimateprojects:
get:
tags:
- estimate
description: |-
get estimateprojects
<br>
operationId: EstimateController.getestimateprojects
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetEstimateProjects'
"400":
description: Invalid page supplied
"404":
description: Page not found
/estimate/getestimateprojectsdata:
get:
tags:
- estimate
description: |-
get estimateProjects Data
<br>
operationId: EstimateController.getEstimateProjectsData
parameters:
- in: query
name: page
description: The page of estimateProjectsData
type: string
- in: query
name: limit
description: The size of page
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetEstimateProjectsData'
"400":
description: Invalid page supplied
"404":
description: Page not found
/estimate/updateestproject:
post:
tags:
- estimate
description: |-
      update user
<br>
operationId: EstimateController.update user
parameters:
- in: query
name: pk
description: The pk of user
type: string
- in: query
name: name
description: The name of user
type: string
- in: query
name: value
description: The value of user
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.User'
"400":
description: Invalid page supplied
"404":
description: user not found
/estimate/uploadexcel:
get:
tags:
- estimate
description: |-
get excel
<br>
operationId: EstimateController.upload excel
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetexcelPage'
"400":
description: Invalid page supplied
"404":
description: Page not found
/estimate/uploadexcelestimate:
post:
tags:
- estimate
description: |-
post file by BootstrapFileInput
<br>
operationId: EstimateController.post bootstrapfileinput
parameters:
- in: path
name: id
description: The id of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/SUCCESS'
"400":
description: Invalid page supplied
"404":
description: page not found
/excel/addexcelarticle/{id}:
post:
tags:
- excel
description: |-
post article by excelID
<br>
operationId: ExcelController.post excel artile by excelID
parameters:
- in: query
name: id
description: The id of excelID
required: true
type: string
- in: query
name: title
description: The title of article
required: true
type: string
- in: query
name: content
description: The content of article
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.excelArticle'
"400":
description: Invalid page supplied
"404":
description: article not found
/excel/deleteexcel/{id}:
post:
tags:
- excel
description: |-
delete excel by id
<br>
operationId: ExcelController.delete excel
parameters:
- in: path
name: id
description: The id of excel
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.ExcelTemple'
"400":
description: Invalid page supplied
"404":
description: excel not found
/excel/editorexcelarticle/{id}:
get:
tags:
- excel
description: |-
get editorarticle by excelarticleID
<br>
operationId: ExcelController.get editorexcel article by excelarticleID
parameters:
- in: path
name: id
description: The id of excelarticleID
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.excelArticle'
"400":
description: Invalid page supplied
"404":
description: article not found
/excel/excelarticle/{id}:
get:
tags:
- excel
description: |-
get article by excelarticleID
<br>
operationId: ExcelController.get excel artile by excelarticleID
parameters:
- in: query
name: id
description: The id of excelarticleID
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.excelArticle'
"400":
description: Invalid page supplied
"404":
description: article not found
/excel/excelcal/{id}:
get:
tags:
- excel
description: |-
get excel
<br>
operationId: ExcelController.get excel
parameters:
- in: path
name: id
description: The id of excel
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetexcelPage'
"400":
description: Invalid page supplied
"404":
description: Page not found
/excel/getexcel/{id}:
get:
tags:
- excel
description: |-
get excel
<br>
operationId: ExcelController.get excel
parameters:
- in: path
name: id
description: The id of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.ExcelTemple'
"400":
description: Invalid page supplied
"404":
description: Page not found
/excel/getexcel/excellist/{id}:
get:
tags:
- excel
description: |-
get excel
<br>
operationId: ExcelController.get excel
parameters:
- in: path
name: id
description: The id of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetexcelPage'
"400":
description: Invalid page supplied
"404":
description: Page not found
/excel/getexcel/getexcels/{id}:
get:
tags:
- excel
description: |-
get projproducts..
<br>
operationId: ExcelController.get projproducts...
parameters:
- in: path
name: id
description: The id of projproducts
required: true
type: string
- in: query
name: searchText
description: The searchText of projproducts
type: string
- in: query
name: page
description: The page of projproducts
type: string
- in: query
name: limit
description: The size of page
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.ExcelTemple'
"400":
description: Invalid page supplied
"404":
description: data not found
/excel/getexcel/getwxexcels/{id}:
get:
tags:
- excel
description: |-
get excels..
<br>
operationId: ExcelController.get excels...
parameters:
- in: path
name: id
      description: The class id of excels
required: true
type: string
- in: query
name: searchText
      description: The searchText of excels
type: string
- in: query
name: page
description: The page of projproducts
type: string
- in: query
name: limit
description: The size of page
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.ExcelTemple'
"400":
description: Invalid page supplied
"404":
description: data not found
/excel/getexcelcalinput/{id}:
get:
tags:
- excel
description: |-
get excel
<br>
operationId: ExcelController.get excel
parameters:
- in: path
name: id
description: The id of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.ExcelInputs'
"400":
description: Invalid page supplied
"404":
description: Page not found
/excel/getexcelcaloutput/{id}:
get:
tags:
- excel
description: |-
get excel
<br>
operationId: ExcelController.get excel
parameters:
- in: path
name: id
description: The id of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.ExcelOutputs'
"400":
description: Invalid page supplied
"404":
description: Page not found
/excel/getexcelhistory/{id}:
get:
tags:
- excel
description: |-
get history..
<br>
operationId: ExcelController.get history...
parameters:
- in: path
name: id
description: The id of excels
required: true
type: string
- in: query
name: searchText
description: The searchText of excels
type: string
- in: query
name: page
description: The page of excels
type: string
- in: query
name: limit
description: The size of page
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.ExcelHistory'
"400":
description: Invalid page supplied
"404":
description: data not found
/excel/getexcelhistorycal/{id}:
get:
tags:
- excel
description: |-
get excelcal history
<br>
operationId: ExcelController.get excelcal history
parameters:
- in: path
name: id
description: The id of history
required: true
type: string
- in: path
name: excelid
description: The id of excels
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetexcelPage'
"400":
description: Invalid page supplied
"404":
description: Page not found
/excel/getexcelhistoryinput/{id}:
get:
tags:
- excel
description: |-
get excel
<br>
operationId: ExcelController.get excel
parameters:
- in: path
name: id
description: The id of history
required: true
type: string
- in: query
name: excelid
description: The id of exceltemple
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.ExcelInputs'
"400":
description: Invalid page supplied
"404":
description: Page not found
/excel/getexcelhistoryoutput/{id}:
get:
tags:
- excel
description: |-
get excel
<br>
operationId: ExcelController.get excel
parameters:
- in: path
name: id
description: The id of history
required: true
type: string
- in: query
name: excelid
description: The id of exceltemple
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.ExcelOutputs'
"400":
description: Invalid page supplied
"404":
description: Page not found
/excel/getexceltemplepdf/{id}:
get:
tags:
- excel
description: |-
get excelpdf by exceltempleID
<br>
operationId: ExcelController.get excelpdf by exceltempleID
parameters:
- in: query
name: id
description: The id of exceltempleID
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.excelPdf'
"400":
description: Invalid page supplied
"404":
description: excelpdf not found
/excel/gethistoryexcel/{id}:
get:
tags:
- excel
description: |-
get excel history
<br>
operationId: ExcelController.get excel history
parameters:
- in: path
name: id
description: The id of excels
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.ExcelHistory'
"400":
description: Invalid page supplied
"404":
description: Page not found
/excel/getwxexcelcalinput/{id}:
get:
tags:
- excel
description: |-
get excel
<br>
operationId: ExcelController.get excel
parameters:
- in: path
name: id
description: The id of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.ExcelInputs'
"400":
description: Invalid page supplied
"404":
description: Page not found
/excel/getwxexcelcaloutput/{id}:
get:
tags:
- excel
description: |-
get excel
<br>
operationId: ExcelController.get excel
parameters:
- in: path
name: id
description: The id of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.ExcelOutputs'
"400":
description: Invalid page supplied
"404":
description: Page not found
/excel/getwxexcelclass/{id}:
get:
tags:
- excel
description: |-
get excel
<br>
operationId: ExcelController.get excel
parameters:
- in: path
name: id
description: The id of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/FileNode1'
"400":
description: Invalid page supplied
"404":
description: Page not found
/excel/getwxexcelhistory/{id}:
get:
tags:
- excel
description: |-
get excelwxhistory..
<br>
operationId: ExcelController.get excelwxhistory...
parameters:
- in: path
name: id
description: The id of excels
required: true
type: string
- in: query
name: searchText
description: The searchText of excelhistory
type: string
- in: query
name: page
description: The page of excelhistory
type: string
- in: query
name: limit
description: The size of page
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.ExcelHistory'
"400":
description: Invalid page supplied
"404":
description: data not found
/excel/getwxexcelhistoryinput/{id}:
get:
tags:
- excel
description: |-
get wxexcelhistoryinput
<br>
operationId: ExcelController.get wxexcelhistoryinput
parameters:
- in: path
name: id
description: The id of history
required: true
type: string
- in: query
name: templeid
description: The id of exceltemple
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.ExcelInputs'
"400":
description: Invalid page supplied
"404":
description: Page not found
/excel/getwxexcelhistoryoutput/{id}:
get:
tags:
- excel
description: |-
get wxexcelhistoryoutput
<br>
operationId: ExcelController.get wxexcelhistoryoutput
parameters:
- in: path
name: id
description: The id of history
required: true
type: string
- in: query
name: templeid
description: The id of exceltemple
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.ExcelOutputs'
"400":
description: Invalid page supplied
"404":
description: Page not found
/excel/postexcel/{id}:
get:
tags:
- excel
description: |-
get excel
<br>
operationId: ExcelController.get excel
parameters:
- in: path
name: id
description: The id of exceltemple
required: true
type: string
- in: query
name: inputdata
description: The inputdata
type: string
- in: query
name: excelid
description: The excelid
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetexcelPage'
"400":
description: Invalid page supplied
"404":
description: Page not found
/excel/postexcel_back:
post:
tags:
- excel
description: |-
post excel
<br>
operationId: ExcelController.post excel
parameters:
- in: query
name: inputdata
description: The inputdata
type: string
- in: query
name: excelid
description: The excelid
type: string
- in: query
name: description
description: The excelcal description
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetexcelPage'
"400":
description: Invalid page supplied
"404":
description: Page not found
/excel/postwxexcel/{id}:
get:
tags:
- excel
description: |-
post excelcal
<br>
operationId: ExcelController.post excelcal
parameters:
- in: path
name: id
description: The id of exceltemple
required: true
type: string
- in: query
name: inputdata
description: The inputdata
type: string
- in: query
name: excelid
description: The excelid
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetexcelPage'
"400":
description: Invalid page supplied
"404":
description: Page not found
/excel/postwxexcel_back:
post:
tags:
- excel
description: |-
post excel
<br>
operationId: ExcelController.post excel
parameters:
- in: query
name: inputdata
description: The inputdata
type: string
- in: query
name: templeid
description: The exceltempleid
type: string
- in: query
name: description
description: The excelcal description
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetexcelPage'
"400":
description: Invalid page supplied
"404":
description: Page not found
/excel/putexcelcalinput:
post:
tags:
- excel
description: |-
put excelinput
<br>
operationId: ExcelController.put excelinput
parameters:
- in: path
name: id
description: The id of excelinput
required: true
type: string
- in: query
name: value
description: The value
required: true
type: string
- in: query
name: name
description: The name
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.ExcelInputs'
"400":
description: Invalid page supplied
"404":
description: Page not found
/excel/putexcelcaloutput:
post:
tags:
- excel
description: |-
put excel output comment
<br>
operationId: ExcelController.put excel outputcomment
parameters:
- in: path
name: id
description: The id of output
required: true
type: string
- in: query
name: value
description: The value
required: true
type: string
- in: query
name: name
description: The name
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.ExcelOutputs'
"400":
description: Invalid page supplied
"404":
description: Page not found
/excel/putexceltemple:
post:
tags:
- excel
description: |-
put exceltemple
<br>
operationId: ExcelController.put exceltemple
parameters:
- in: path
name: id
description: The id of exceltemple
required: true
type: string
- in: query
name: value
description: The value
required: true
type: string
- in: query
name: name
description: The name
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.ExcelTemple'
"400":
description: Invalid page supplied
"404":
description: Page not found
/excel/updateexcelarticle:
post:
tags:
- excel
description: |-
post editorarticle
<br>
operationId: ExcelController.post editorexcel article
parameters:
- in: query
name: aid
description: The id of excelarticle
required: true
type: string
- in: query
name: subtext
description: The subtext of excelarticle
required: true
type: string
- in: query
name: content
description: The content of excelarticle
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.excelArticle'
"400":
description: Invalid page supplied
"404":
description: article not found
/excel/uploadexcel/{id}:
get:
tags:
- excel
description: |-
get excel
<br>
operationId: ExcelController.upload excelexcel
parameters:
- in: path
name: id
description: The id of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetexcelPage'
"400":
description: Invalid page supplied
"404":
description: Page not found
/excel/uploadexceltemple/{id}:
post:
tags:
- excel
description: |-
post file by BootstrapFileInput
<br>
operationId: ExcelController.post bootstrapfileinput
parameters:
- in: path
name: id
description: The id of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/SUCCESS'
"400":
description: Invalid page supplied
"404":
description: page not found
/excel/uploadexceltemple_excelize/{id}:
post:
tags:
- excel
description: |-
post file by BootstrapFileInput
<br>
operationId: ExcelController.post bootstrapfileinput
parameters:
- in: path
name: id
description: The id of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/SUCCESS'
"400":
description: Invalid page supplied
"404":
description: page not found
/excel/wsexcelcal/{id}:
get:
tags:
- excel
description: |-
get excelwscal
<br>
operationId: WsExcelcalController.get excelwscal
parameters:
- in: path
name: id
description: The id of exceltemple
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetExcel'
"400":
description: Invalid page supplied
"404":
description: Page not found
/excel/wswxexcelcal/{id}:
get:
tags:
- excel
description: |-
get wxexcelcal
<br>
operationId: WsExcelcalController.get wxexcelcal
parameters:
- in: path
name: id
description: The id of exceltemple
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetExcel'
"400":
description: Invalid page supplied
"404":
description: Page not found
/excel/wxexcelarticle/{id}:
get:
tags:
- excel
description: |-
get wxarticle by excelarticleID
<br>
operationId: ExcelController.get wxexcel artile by excelarticleID
parameters:
- in: path
name: id
description: The id of excelarticleID
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.excelArticle'
"400":
description: Invalid page supplied
"404":
description: article not found
/fileinput/bootstrapfileinput:
post:
tags:
- fileinput
description: |-
post file by BootstrapFileInput
<br>
operationId: FileinputController.post bootstrapfileinput
responses:
"200":
description: ""
schema:
$ref: '#/definitions/SUCCESS'
"400":
description: Invalid page supplied
"404":
          description: article not found
/flv/:
get:
tags:
- flv
description: |-
        get flv video stream
<br>
operationId: FlvController.getFlv
parameters:
- in: query
name: filepath
description: The mp4 file path
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/success'
"400":
description: Invalid page
"404":
description: page not found
/flv/flvlist:
get:
tags:
- flv
description: |-
        get flv list
<br>
operationId: FlvController.getFlv
parameters:
- in: query
name: xxl_sso_token
description: The tokenText
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/success'
"400":
description: Invalid page
"404":
description: page not found
/freecad/editfcmodel:
post:
tags:
- freecad
description: |-
post freecad model parameter json by ajax
<br>
operationId: FreeCADController.post freecad model parameter
responses:
"200":
description: ""
schema:
$ref: '#/definitions/SUCCESS'
"400":
description: Invalid page supplied
"404":
description: page not found
/freecad/editfcmodelbatch:
post:
tags:
- freecad
description: |-
post freecad model parameter batch json by ajax
<br>
operationId: FreeCADController.post freecad model parameter batch
responses:
"200":
description: ""
schema:
$ref: '#/definitions/SUCCESS'
"400":
description: Invalid page supplied
"404":
description: page not found
/freecad/freecad:
get:
tags:
- freecad
description: |-
        get freecad temples by keyword...
        <br>
      operationId: FreeCADController.get freecad temples by keyword...
parameters:
- in: query
name: keyword
        description: The keyword of freecad temples
type: string
- in: query
name: page
        description: The page of freecad temples
type: string
- in: query
name: limit
description: The size of page
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.UserTemple'
"400":
description: Invalid page supplied
"404":
description: data not found
/freecad/freecaddata:
get:
tags:
- freecad
description: |-
get freecad models by keyword...
<br>
operationId: FreeCADController.get freecad models by keyword...
parameters:
- in: query
name: keyword
description: The keyword of freecad models
type: string
- in: query
name: page
description: The page of freecad models
type: string
- in: query
name: limit
description: The size of page
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.UserTemple'
"400":
description: Invalid page supplied
"404":
description: data not found
/freecad/freecadmenu:
get:
tags:
- freecad
description: |-
        get freecad menu by keyword...
        <br>
      operationId: FreeCADController.get freecad menu by keyword...
parameters:
- in: query
name: keyword
        description: The keyword of freecad temples
type: string
- in: query
name: page
        description: The page of freecad temples
type: string
- in: query
name: limit
description: The size of page
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.UserTemple'
"400":
description: Invalid page supplied
"404":
description: data not found
/freecad/freecadmodel/{id}:
get:
tags:
- freecad
description: |-
get freecad model by id...
<br>
operationId: FreeCADController.get freecad model by id...
parameters:
- in: path
name: id
description: The id of freecad model
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.FreecadModel'
"400":
description: Invalid page supplied
"404":
description: data not found
/freecad/getfcparameter/{id}:
get:
tags:
- freecad
description: |-
get freecad parameter
<br>
operationId: FreeCADController.get freecad parameter
parameters:
- in: query
name: id
description: The model id
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.FreecadInputs'
"400":
description: Invalid page supplied
"404":
description: model not found
/freecad/getfreecad/{id}:
get:
tags:
- freecad
description: |-
        get freecad temple by id...
        <br>
      operationId: FreeCADController.get freecad temple by id...
parameters:
- in: path
name: id
description: The id of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.UserTemple'
"400":
description: Invalid page supplied
"404":
description: data not found
/freecad/online3deditor:
get:
tags:
- freecad
description: |-
online3deditor
<br>
operationId: FreeCADController.get online3deditor
parameters:
- in: query
name: model
description: The model url
required: true
type: string
- in: query
name: id
description: The model id
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AddArticle'
"400":
description: Invalid page supplied
"404":
          description: article not found
/freecad/online3dview:
get:
tags:
- freecad
description: |-
getthreejs
<br>
operationId: FreeCADController.get online3dview
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AddArticle'
"400":
description: Invalid page supplied
"404":
          description: article not found
/freecad/online3dviewembed:
get:
tags:
- freecad
description: |-
getthreejs
<br>
operationId: FreeCADController.get online3dviewembed
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AddArticle'
"400":
description: Invalid page supplied
"404":
          description: article not found
/freecad/turbin:
get:
tags:
- freecad
description: |-
getthreejs
<br>
operationId: FreeCADController.get turbin
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AddArticle'
"400":
description: Invalid page supplied
"404":
          description: article not found
/freecad/update:
get:
tags:
- freecad
description: |-
get freecad model update by id...
<br>
operationId: FreeCADController.get freecad model update by id...
parameters:
- in: query
name: id
description: The id of freecad
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.UserTemple'
"400":
description: Invalid page supplied
"404":
description: data not found
/freecad/updatefreecad:
post:
tags:
- freecad
description: |-
post freecad model update by id...
<br>
operationId: FreeCADController.post freecad model update by id...
parameters:
- in: query
name: id
description: The id of freecad
required: true
type: string
- in: query
name: description
description: The description of freecad
type: string
- in: query
name: indicators
description: The indicators of freecad
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.FreeCAD'
"400":
description: Invalid page supplied
"404":
description: data not found
/freecad/uploadmodel:
get:
tags:
- freecad
description: |-
        get freecad model upload page
        <br>
      operationId: FreeCADController.upload freecad model
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetMathcadPage'
"400":
description: Invalid page supplied
"404":
description: Page not found
/freecad/uploadmodelfile:
post:
tags:
- freecad
description: |-
post file by BootstrapFileInput
<br>
operationId: FreeCADController.post bootstrapfileinput
responses:
"200":
description: ""
schema:
$ref: '#/definitions/SUCCESS'
"400":
description: Invalid page supplied
"404":
description: page not found
/mathcad/addmatharticle/{id}:
post:
tags:
- mathcad
description: |-
post article by templeID
<br>
      operationId: MathcadController.post math article by templeID
parameters:
- in: query
name: id
description: The id of templeID
required: true
type: string
- in: query
name: title
description: The title of article
required: true
type: string
- in: query
name: content
description: The content of article
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.MathArticle'
"400":
description: Invalid page supplied
"404":
description: article not found
/mathcad/deletetemple/{id}:
post:
tags:
- mathcad
description: |-
delete temple by id
<br>
operationId: MathcadController.delete temple
parameters:
- in: path
name: id
description: The id of temple
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.UserTemple'
"400":
description: Invalid page supplied
"404":
description: temple not found
/mathcad/editormatharticle/{id}:
get:
tags:
- mathcad
description: |-
get editorarticle by matharticleID
<br>
operationId: MathcadController.get editormath article by matharticleID
parameters:
- in: path
name: id
description: The id of matharticleID
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.MathArticle'
"400":
description: Invalid page supplied
"404":
description: article not found
/mathcad/gethistoryinput/{id}:
get:
tags:
- mathcad
description: |-
get mathcad
<br>
operationId: MathcadController.get mathcad
parameters:
- in: path
name: id
description: The id of history
required: true
type: string
- in: query
name: templeid
description: The id of usertemple
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.TempleInputs'
"400":
description: Invalid page supplied
"404":
description: Page not found
/mathcad/gethistorymath/{id}:
get:
tags:
- mathcad
description: |-
get mathcad history
<br>
operationId: MathcadController.get mathcad history
parameters:
- in: path
name: id
description: The id of temples
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.UserHistory'
"400":
description: Invalid page supplied
"404":
description: Page not found
/mathcad/gethistorymathcal/{id}:
get:
tags:
- mathcad
description: |-
get mathcal history
<br>
operationId: MathcadController.get mathcal history
parameters:
- in: path
name: id
description: The id of history
required: true
type: string
- in: path
name: templeid
description: The id of temples
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetMathcadPage'
"400":
description: Invalid page supplied
"404":
description: Page not found
/mathcad/gethistoryoutput/{id}:
get:
tags:
- mathcad
description: |-
get mathcad history output
<br>
operationId: MathcadController.get mathcad history output
parameters:
- in: path
name: id
description: The id of history
required: true
type: string
- in: query
name: templeid
description: The id of usertemple
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.TempleOutputs'
"400":
description: Invalid page supplied
"404":
description: Page not found
/mathcad/gethistorytemples/{id}:
get:
tags:
- mathcad
description: |-
get history..
<br>
operationId: MathcadController.get history...
parameters:
- in: path
name: id
description: The id of temples
required: true
type: string
- in: query
name: searchText
description: The searchText of temples
type: string
- in: query
name: page
description: The page of temples
type: string
- in: query
name: limit
description: The size of page
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.UserHistory'
"400":
description: Invalid page supplied
"404":
description: data not found
/mathcad/getmath/{id}:
get:
tags:
- mathcad
description: |-
get mathcadclass
<br>
operationId: MathcadController.get mathcadclass
parameters:
- in: path
name: id
description: The id of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.UserTemple'
"400":
description: Invalid page supplied
"404":
description: Page not found
/mathcad/getmath/gettemples/{id}:
get:
tags:
- mathcad
description: |-
get mathcad temples..
<br>
operationId: MathcadController.get mathcad temples by classid...
parameters:
- in: path
name: id
description: The id of class
required: true
type: string
- in: query
name: searchText
description: The searchText of temples
type: string
- in: query
name: page
description: The page of temples
type: string
- in: query
name: limit
description: The size of page
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.UserTemple'
"400":
description: Invalid page supplied
"404":
description: data not found
/mathcad/getmath/getwxtemples/{id}:
get:
tags:
- mathcad
description: |-
get mathtemples..
<br>
operationId: MathcadController.get mathtemples...
parameters:
- in: path
name: id
description: The class id of mathtemples
required: true
type: string
- in: query
name: searchText
description: The searchText of mathtemples
type: string
- in: query
name: page
        description: The page of mathtemples
type: string
- in: query
name: limit
description: The size of page
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.UserTemple'
"400":
description: Invalid page supplied
"404":
description: data not found
/mathcad/getmath/templelist/{id}:
get:
tags:
- mathcad
description: |-
get mathcad templelist
<br>
operationId: MathcadController.get mathcad templelist
parameters:
- in: path
name: id
description: The id of templeclass
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetMathcadPage'
"400":
description: Invalid page supplied
"404":
description: Page not found
/mathcad/getmathcalinput/{id}:
get:
tags:
- mathcad
description: |-
get mathcad
<br>
operationId: MathcadController.get mathcad
parameters:
- in: path
name: id
description: The id of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.TempleInputs'
"400":
description: Invalid page supplied
"404":
description: Page not found
/mathcad/getmathcaloutput/{id}:
get:
tags:
- mathcad
description: |-
get mathcad
<br>
operationId: MathcadController.get mathcad
parameters:
- in: path
name: id
description: The id of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.TempleOutputs'
"400":
description: Invalid page supplied
"404":
description: Page not found
/mathcad/getmathtemplepdf/{id}:
get:
tags:
- mathcad
description: |-
get mathpdf by usertempleID
<br>
operationId: MathcadController.get mathpdf by usertempleID
parameters:
- in: query
name: id
description: The id of usertempleID
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.MathPdf'
"400":
description: Invalid page supplied
"404":
description: mathpdf not found
/mathcad/getwxhistoryinput/{id}:
get:
tags:
- mathcad
description: |-
get wxmathcadhistoryinput
<br>
operationId: MathcadController.get wxmathcadhistoryinput
parameters:
- in: path
name: id
description: The id of history
required: true
type: string
- in: query
name: templeid
description: The id of usertemple
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.TempleInputs'
"400":
description: Invalid page supplied
"404":
description: Page not found
/mathcad/getwxhistoryoutput/{id}:
get:
tags:
- mathcad
description: |-
get wxmathcad history output
<br>
operationId: MathcadController.get wxmathcad history output
parameters:
- in: path
name: id
description: The id of history
required: true
type: string
- in: query
name: templeid
description: The id of usertemple
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.TempleOutputs'
"400":
description: Invalid page supplied
"404":
description: Page not found
/mathcad/getwxhistorytemples/{id}:
get:
tags:
- mathcad
description: |-
get mathwxhistory..
<br>
operationId: MathcadController.get mathwxhistory...
parameters:
- in: path
name: id
description: The id of temples
required: true
type: string
- in: query
name: searchText
description: The searchText of mathhistory
type: string
- in: query
name: page
description: The page of mathhistory
type: string
- in: query
name: limit
description: The size of page
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.UserHistory'
"400":
description: Invalid page supplied
"404":
description: data not found
/mathcad/getwxmathcalinput/{id}:
get:
tags:
- mathcad
description: |-
get mathcad
<br>
operationId: MathcadController.get mathcad
parameters:
- in: path
name: id
description: The id of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.TempleInputs'
"400":
description: Invalid page supplied
"404":
description: Page not found
/mathcad/getwxmathcaloutput/{id}:
get:
tags:
- mathcad
description: |-
get mathcad
<br>
operationId: MathcadController.get mathcad
parameters:
- in: path
name: id
description: The id of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.TempleOutputs'
"400":
description: Invalid page supplied
"404":
description: Page not found
/mathcad/getwxmathclass/{id}:
get:
tags:
- mathcad
description: |-
get mathcad
<br>
operationId: MathcadController.get mathcadclass
parameters:
- in: path
name: id
description: The id of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/FileNode1'
"400":
description: Invalid page supplied
"404":
description: Page not found
/mathcad/matharticle/{id}:
get:
tags:
- mathcad
description: |-
get article by matharticleID
<br>
      operationId: MathcadController.get math article by matharticleID
parameters:
- in: query
name: id
description: The id of matharticleID
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.MathArticle'
"400":
description: Invalid page supplied
"404":
description: article not found
/mathcad/mathcal/{id}:
get:
tags:
- mathcad
description: |-
get mathcad
<br>
operationId: MathcadController.get mathcad
parameters:
- in: path
name: id
description: The id of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetMathcadPage'
"400":
description: Invalid page supplied
"404":
description: Page not found
/mathcad/postmath:
post:
tags:
- mathcad
description: |-
post mathcad
<br>
operationId: MathcadController.post mathcad
parameters:
- in: query
name: inputdata
description: The inputdata
type: string
- in: query
name: templeid
description: The templeid
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetMathcadPage'
"400":
description: Invalid page supplied
"404":
description: Page not found
/mathcad/postmath2/{id}:
get:
tags:
- mathcad
description: |-
get mathcad
<br>
operationId: MathcadController.get mathcad
parameters:
- in: path
name: id
description: The id of usertemple
required: true
type: string
- in: query
name: inputdata
description: The inputdata
type: string
- in: query
name: templeid
description: The templeid
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetMathcadPage'
"400":
description: Invalid page supplied
"404":
description: Page not found
/mathcad/postmathback:
post:
tags:
- mathcad
description: |-
post mathcad
<br>
operationId: MathcadController.post mathcad
parameters:
- in: query
name: firstInput
description: The firstInput
type: string
- in: query
name: secondInput
description: The secondInput
type: string
- in: query
name: thirdInput
description: The thirdInput
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetMathcadPage'
"400":
description: Invalid page supplied
"404":
description: Page not found
/mathcad/postwxmath:
post:
tags:
- mathcad
description: |-
post mathcad
<br>
operationId: MathcadController.post mathcad
parameters:
- in: query
name: inputdata
description: The inputdata
type: string
- in: query
name: templeid
description: The templeid
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetMathcadPage'
"400":
description: Invalid page supplied
"404":
description: Page not found
/mathcad/postwxmath2/{id}:
get:
tags:
- mathcad
description: |-
        get mathcadcal
<br>
operationId: MathcadController.post mathcadcal
parameters:
- in: path
name: id
description: The id of usertemple
required: true
type: string
- in: query
name: inputdata
description: The inputdata
type: string
- in: query
name: templeid
description: The templeid
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetMathcadPage'
"400":
description: Invalid page supplied
"404":
description: Page not found
/mathcad/putmathcalinput:
post:
tags:
- mathcad
description: |-
put mathinput
<br>
operationId: MathcadController.put mathinput
parameters:
- in: path
name: id
description: The id of mathinput
required: true
type: string
- in: query
name: value
description: The value
required: true
type: string
- in: query
name: name
description: The name
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.TempleInputs'
"400":
description: Invalid page supplied
"404":
description: Page not found
/mathcad/putmathcaloutput:
post:
tags:
- mathcad
description: |-
put mathcad output comment
<br>
operationId: MathcadController.put mathcad outputcomment
parameters:
- in: path
name: id
description: The id of output
required: true
type: string
- in: query
name: value
description: The value
required: true
type: string
- in: query
name: name
description: The name
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.TempleOutputs'
"400":
description: Invalid page supplied
"404":
description: Page not found
/mathcad/putusertemple:
post:
tags:
- mathcad
description: |-
put usertemple
<br>
operationId: MathcadController.put usertemple
parameters:
- in: path
name: id
description: The id of usertemple
required: true
type: string
- in: query
name: value
description: The value
required: true
type: string
- in: query
name: name
description: The name
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.UserTemple'
"400":
description: Invalid page supplied
"404":
description: Page not found
/mathcad/updatearticle:
post:
tags:
- mathcad
description: |-
post editorarticle
<br>
operationId: MathcadController.post editormath article
parameters:
- in: query
name: aid
description: The id of matharticle
required: true
type: string
- in: query
name: subtext
description: The subtext of matharticle
required: true
type: string
- in: query
name: content
description: The content of matharticle
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.MathArticle'
"400":
description: Invalid page supplied
"404":
description: article not found
/mathcad/uploadmathtemple/{id}:
post:
tags:
- mathcad
description: |-
post file by BootstrapFileInput
<br>
operationId: MathcadController.post bootstrapfileinput
parameters:
- in: path
name: id
description: The id of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/SUCCESS'
"400":
description: Invalid page supplied
"404":
description: page not found
/mathcad/uploadtemple/{id}:
get:
tags:
- mathcad
description: |-
get mathcad
<br>
operationId: MathcadController.upload mathcadtemple
parameters:
- in: path
name: id
description: The id of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetMathcadPage'
"400":
description: Invalid page supplied
"404":
description: Page not found
/mathcad/wsmathcal/{id}:
get:
tags:
- mathcad
description: |-
get mathcad
<br>
operationId: WsMathcadController.get mathcad
parameters:
- in: path
name: id
description: The id of mathtemple
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetMathcadPage'
"400":
description: Invalid page supplied
"404":
description: Page not found
/mathcad/wswxmathcal/{id}:
get:
tags:
- mathcad
description: |-
get mathcad
<br>
operationId: WsMathcadController.get mathcad
parameters:
- in: path
name: id
description: The id of mathtemple
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetMathcadPage'
"400":
description: Invalid page supplied
"404":
description: Page not found
/mathcad/wxmatharticle/{id}:
get:
tags:
- mathcad
description: |-
get wxarticle by matharticleID
<br>
      operationId: MathcadController.get wxmath article by matharticleID
parameters:
- in: path
name: id
description: The id of matharticleID
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.MathArticle'
"400":
description: Invalid page supplied
"404":
description: article not found
/onlyoffice/addpermission:
post:
tags:
- onlyoffice
description: |-
post Addpermission..
<br>
operationId: OnlyController.post Addpermission...
parameters:
- in: query
name: ids
description: The id of document
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetOnlyoffice'
"400":
description: Invalid page supplied
"404":
description: data not found
/onlyoffice/commandbuilder:
get:
tags:
- onlyoffice
description: |-
get doc to onlyoffice CommandService
<br>
operationId: OnlyController.get CommandService builder doc
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AddArticle'
"400":
description: Invalid page supplied
"404":
description: article not found
/onlyoffice/commanddrop/{id}:
get:
tags:
- onlyoffice
description: |-
get doc to onlyoffice CommandService
<br>
operationId: OnlyController.get CommandService drop doc
parameters:
- in: path
name: id
description: The id of user
required: true
type: string
- in: query
name: key
description: The key of doc
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AddArticle'
"400":
description: Invalid page supplied
"404":
description: article not found
/onlyoffice/commandinfo/{id}:
get:
tags:
- onlyoffice
description: |-
post doc to onlyoffice CommandService
<br>
operationId: OnlyController.post CommandService info doc
parameters:
- in: path
name: id
description: The id of doc
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AddArticle'
"400":
description: Invalid page supplied
"404":
description: article not found
/onlyoffice/conversion:
post:
tags:
- onlyoffice
description: |-
post doc to onlyoffice conversion
<br>
operationId: OnlyController.post conversion doc
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AddArticle'
"400":
description: Invalid page supplied
"404":
          description: article not found
/onlyoffice/downloadonlydoc:
post:
tags:
- onlyoffice
description: |-
post download onlydoc by id
<br>
operationId: OnlyController.post download onlydoc
parameters:
- in: query
name: id
description: The id of onlydoc
required: true
type: string
- in: query
name: url
description: The url of onlyofficeserver
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/onlyoffice/downloadzip:
post:
tags:
- onlyoffice
description: |-
download zip
<br>
operationId: OnlyController.download zip
parameters:
- in: query
name: ids
description: The ids of onlydoc
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.Onlyoffice'
"400":
description: Invalid page supplied
"404":
          description: article not found
/onlyoffice/onlyoffice:
get:
tags:
- onlyoffice
description: |-
get onlyoffice
<br>
      operationId: OnlyController.get onlyoffice
parameters:
- in: query
name: id
description: The id of office
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.Ollyoffice'
"400":
description: Invalid page supplied
"404":
description: office not found
/pdfcpu/addwatermarks/{id}:
post:
tags:
- pdfcpu
description: |-
post signature to pdf
<br>
operationId: PdfCpuController.post signature to pdf
parameters:
- in: path
name: id
description: The id of pdf
required: true
type: string
- in: query
name: pageNumber
description: The pageNumber of pdf
type: string
- in: query
name: numPages
description: The numPages of pdf
required: true
type: string
- in: query
name: offsetdx
description: The offsetdx of signature
type: string
- in: query
name: offsetdy
description: The offsetdy of signature
type: string
- in: query
name: scale
description: The scale of signature
type: string
- in: query
name: image
description: The base64 image of signature
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AddSignature'
"400":
description: Invalid page supplied
"404":
description: pdf not found
/pdfcpu/onlypdf/{id}:
get:
tags:
- pdfcpu
description: |-
get only pdf
<br>
operationId: PdfCpuController.get only pdf
parameters:
- in: path
name: id
description: The id of pdf
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetOnlyPdf'
"400":
description: Invalid page supplied
"404":
description: pdf not found
/pdfcpu/test/{id}:
get:
tags:
- pdfcpu
description: |-
get test
<br>
operationId: PdfCpuController.get test
parameters:
- in: path
name: id
description: The id of test
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.Gettest'
"400":
description: Invalid page supplied
"404":
description: pdf not found
/project/getprojects:
get:
tags:
- project
description: |-
get projectlist..
<br>
operationId: ProjController.get cms projectlist...
parameters:
- in: query
name: projectid
description: The id of project
type: string
- in: query
name: searchText
description: The searchText of project
type: string
- in: query
name: pageNo
description: The page of projectlist
type: string
- in: query
name: limit
description: The size of page
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/project/getprojecttree/{id}:
get:
tags:
- project
description: |-
get projecttree..
<br>
operationId: ProjController.get cms projecttree...
parameters:
- in: path
name: id
description: The id of projecttree
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/project/getwxprojects:
get:
tags:
- project
description: |-
get projectlist..
<br>
operationId: ProjController.get wx projectlist...
parameters:
- in: query
name: projectid
description: The id of project
type: string
- in: query
name: pageNo
description: The page of projectlist
type: string
- in: query
name: limit
description: The size of page
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/project/product/addpermission:
post:
tags:
- project
description: |-
get Addpermission..
<br>
operationId: ProdController.get Addpermission...
parameters:
- in: query
name: ids
description: The id of attachment
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/project/product/getpermission:
get:
tags:
- project
description: |-
get Getpermission..
<br>
operationId: ProdController.get Getpermission...
parameters:
- in: query
name: docid
description: The id of attachment
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/project/product/officeview/{id}:
get:
tags:
- project
description: |-
get OfficeView..
<br>
operationId: ProdController.get OfficeView...
parameters:
- in: path
name: id
description: The id of OfficeView
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/project/product/officeviewcallback:
get:
tags:
- project
description: |-
get OfficeViewCallback..
<br>
operationId: ProdController.get OfficeViewCallback...
parameters:
- in: query
name: id
description: The id of attachment
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/project/project/{id}/timeline:
get:
tags:
- project
description: |-
        get a project time line
<br>
operationId: ProjController.get project time line
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetAllProjCalendar'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/project/project/gettimeline/{id}:
get:
tags:
- project
description: |-
        get a project time line
<br>
operationId: ProjController.get project time line
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetAllProjCalendar'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/project/project/products/{id}:
get:
tags:
- project
description: |-
get projproducts..
<br>
operationId: ProdController.get projproducts...
parameters:
- in: path
name: id
description: The id of projproducts
required: true
type: string
- in: query
name: searchText
description: The searchText of projproducts
type: string
- in: query
name: page
description: The page of projproducts
type: string
- in: query
name: limit
description: The size of page
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/project/projectuserrole:
get:
tags:
- project
description: |-
get project user permission
<br>
operationId: ProjController.get project user permission
parameters:
- in: query
name: pid
description: The id of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProjectPage'
"400":
description: Invalid page supplied
"404":
description: project not found
/project/quickaddwxproject:
post:
tags:
- project
description: |-
post quickaddproject..
<br>
operationId: ProjController.post wx quickaddproject...
parameters:
- in: query
name: projectcode
description: The projectcode of project
required: true
type: string
- in: query
name: projecttitle
description: The projecttitle of project
required: true
type: string
- in: query
name: tempprojid
description: The tempprojid of project
required: true
type: string
- in: query
name: istemppermission
description: The permission of project
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/project/userprojecteditortree:
get:
tags:
- project
description: |-
get user project editor tree
<br>
operationId: ProjController.get user project editor tree
parameters:
- in: query
name: pid
description: The id of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProjectPage'
"400":
description: Invalid page supplied
"404":
description: project not found
/project/userprojectpermission:
get:
tags:
- project
description: |-
        get user project permission
<br>
operationId: ProjController.get user project editor tree
parameters:
- in: query
name: pid
description: The id of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProjectPage'
"400":
description: Invalid page supplied
"404":
description: project not found
/share/browse:
get:
tags:
- share
description: |-
get a share
<br>
operationId: ShareController.get a share
parameters:
- in: query
name: shareUuid
description: The shareUuid of share
required: true
type: string
- in: query
name: code
description: The code of share
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AddArticle'
"400":
description: Invalid page supplied
"404":
          description: article not found
/share/create:
post:
tags:
- share
description: |-
post create a new share
<br>
operationId: ShareController.post create a new share
parameters:
- in: query
name: matterUuids
description: The matterUuids of share
required: true
type: string
- in: query
name: expireInfinity
        description: The expireInfinity of share
required: true
type: boolean
- in: query
name: expireTime
description: The expireTime of share
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AddArticle'
"400":
description: Invalid page supplied
"404":
          description: article not found
/share/detail/{id}:
get:
tags:
- share
description: |-
get a share
<br>
operationId: ShareController.get a share
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AddArticle'
"400":
description: Invalid page supplied
"404":
          description: article not found
/share/download:
post:
tags:
- share
description: |-
download a share
<br>
operationId: ShareController.download a share
parameters:
- in: query
name: shareUuid
description: The shareUuid of share
required: true
type: string
- in: query
name: code
description: The code of share
type: string
- in: query
name: id
description: The id of attachment
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.Attachment'
"400":
description: Invalid page supplied
"404":
          description: article not found
/share/downloadzip:
post:
tags:
- share
description: |-
download a share
<br>
operationId: ShareController.download a share
parameters:
- in: query
name: shareUuid
description: The shareUuid of share
required: true
type: string
- in: query
name: code
description: The code of share
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.Attachment'
"400":
description: Invalid page supplied
"404":
          description: article not found
/simwe/simansys:
get:
tags:
- simwe
description: |-
        get simansys
<br>
operationId: SimweController.get simansys
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AddArticle'
"400":
description: Invalid page supplied
"404":
          description: article not found
/simwe/simwe:
get:
tags:
- simwe
description: |-
        get simwe
<br>
operationId: SimweController.get simwe
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AddArticle'
"400":
description: Invalid page supplied
"404":
          description: article not found
/simwe/threejs:
get:
tags:
- simwe
description: |-
        get threejs
<br>
operationId: SimweController.get threejs
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AddArticle'
"400":
description: Invalid page supplied
"404":
          description: article not found
/todo/create:
post:
tags:
- todo
description: |-
post todo
<br>
operationId: TodoController.post todo
parameters:
- in: query
name: name
description: The name for todo
required: true
type: string
- in: query
name: projectid
description: The projectid of todo
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/todo/deletetodo:
post:
tags:
- todo
description: |-
        delete todolist
<br>
operationId: TodoController.delete todolist
parameters:
- in: query
name: todoid
description: The id of todo
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/todo/gettodo:
get:
tags:
- todo
description: |-
        get todolist
<br>
operationId: TodoController.get todolist
parameters:
- in: query
name: projectid
description: The projectid of todo
required: true
type: string
- in: query
name: page
description: The page of todo
type: string
- in: query
name: limit
description: The size of todo
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/todo/updatetodo:
post:
tags:
- todo
description: |-
        update todolist
<br>
operationId: TodoController.update todolist
parameters:
- in: query
name: todoid
description: The id of todo
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/wx/addapplyrecharge/{id}:
post:
tags:
- wx
description: |-
        post user recharge
<br>
operationId: PayController.post user recharge
parameters:
- in: path
name: id
description: The id of user
required: true
type: string
- in: query
name: amount
description: The amount of money
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.Recharge'
"400":
description: Invalid page supplied
"404":
description: recharge not found
/wx/addbusiness/{id}:
post:
tags:
- wx
description: |-
post business by projectid
<br>
operationId: BusinessController.post business by projectid
parameters:
- in: query
name: content
description: The content of release
required: true
type: string
- in: query
name: location
description: The location of business
required: true
type: string
- in: query
name: lat
description: The lat of location
type: string
- in: query
name: lng
description: The lng of location
type: string
- in: query
name: startDate
description: The startDate of business
type: string
- in: query
name: endDate
description: The endDate of business
type: string
- in: query
name: projecttitle
description: The projecttitle of business
type: string
- in: query
name: drivername
description: The drivername of business
type: string
- in: query
name: subsidy
description: The subsidy of business
type: string
- in: query
name: carfare
description: The carfare of business
type: string
- in: query
name: hotelfee
description: The hotelfee of business
type: string
- in: query
name: users
description: The users of business
type: string
- in: query
name: articleshow
        description: The articleshow of business
type: string
- in: path
name: id
description: The projectid of project
required: true
type: string
- in: query
name: title
description: The title of article
type: string
- in: query
name: articlecontent
description: The content of article
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.CreateBusiness'
"400":
description: Invalid page supplied
"404":
          description: business not found
/wx/addlocationnavigate/{id}:
post:
tags:
- wx
description: |-
post locationnavigate by locationid
<br>
operationId: LocationController.post locationnavigate by locationid
parameters:
- in: query
name: title
description: The title of location
required: true
type: string
- in: query
name: label
description: The label of locationnavigate
required: true
type: string
- in: query
name: location
description: The location of location
required: true
type: string
- in: query
name: address
description: The address of location
type: string
- in: query
name: lat
description: The lat of location
type: string
- in: query
name: lng
description: The lng of location
type: string
- in: path
name: id
description: The locationid of location
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.CreateLocation'
"400":
description: Invalid page supplied
"404":
          description: location not found
/wx/addlocationpart/{id}:
post:
tags:
- wx
description: |-
post location by projectid
<br>
operationId: LocationController.post location by projectid
parameters:
- in: query
name: title
description: The title of locationpart
required: true
type: string
- in: query
name: describe
description: The describe of locationpart
required: true
type: string
- in: query
name: sort
description: The sort of locationpart
required: true
type: string
- in: query
name: userid
description: The userid of location
required: true
type: string
- in: path
name: id
description: The projectid of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.CreateLocation'
"400":
description: Invalid page supplied
"404":
          description: location not found
/wx/adduser:
post:
tags:
- wx
description: |-
add user
<br>
operationId: UserController.add user
parameters:
- in: query
name: username
description: The name of user
type: string
- in: query
name: nickname
description: The nickname of user
type: string
- in: query
name: password
description: The password of user
type: string
- in: query
name: email
description: The email of user
type: string
- in: query
name: department
description: The department of user
type: string
- in: query
name: secoffice
description: The secoffice of user
type: string
- in: query
name: ip
description: The ip of user
type: string
- in: query
name: port
description: The port of user
type: string
- in: query
name: status
description: The status of user
type: string
- in: query
name: role
description: The role of user
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.User'
"400":
description: Invalid page supplied
"404":
description: user not found
/wx/adduserpays:
post:
tags:
- wx
description: |-
post userpay by mathtempid
<br>
operationId: PayController.post userpay by mathtempleid
parameters:
- in: query
name: usertempleid
description: The id of mathtemple
required: true
type: string
- in: query
name: amount
        description: The amount of pays
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.Pay'
"400":
description: Invalid page supplied
"404":
          description: pays not found
/wx/adduserrecharge/{id}:
post:
tags:
- wx
description: |-
        post users recharge
<br>
operationId: PayController.post users recharge
parameters:
- in: path
name: id
description: The id of recharge
required: true
type: string
- in: query
name: amount
description: The amount of money
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.Recharge'
"400":
description: Invalid page supplied
"404":
description: recharge not found
/wx/addwxarticle:
post:
tags:
- wx
description: |-
post article by catalogid
<br>
operationId: ArticleController.post wx artile by catalogId
parameters:
- in: query
name: title
description: The title of article
required: true
type: string
- in: query
name: content
description: The content of article
required: true
type: string
- in: query
name: skey
description: The skey of user
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AddArticle'
"400":
description: Invalid page supplied
"404":
          description: article not found
/wx/addwxarticleflow/{id}:
post:
tags:
- wx
description: |-
post article by catalogid
<br>
operationId: ArticleController.post wx artile by catalogId
parameters:
- in: path
name: id
description: The id of project
required: true
type: string
- in: query
name: title
description: The title of article
required: true
type: string
- in: query
name: content
description: The content of article
required: true
type: string
- in: query
name: dtid
description: The id of doctype
required: true
type: string
- in: query
name: acid
description: The id of accesscontext
required: true
type: string
- in: query
name: gid
description: The id of group
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AddArticle'
"400":
description: Invalid page supplied
"404":
          description: article not found
/wx/addwxarticles/{id}:
post:
tags:
- wx
description: |-
post article by catalogid
<br>
operationId: ArticleController.post wx artile by catalogId
parameters:
- in: query
name: id
description: The id of project
required: true
type: string
- in: query
name: title
description: The title of article
required: true
type: string
- in: query
name: content
description: The content of article
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AddArticle'
"400":
description: Invalid page supplied
"404":
          description: article not found
/wx/addwxattachment:
post:
tags:
- wx
description: |-
post attachment by projectid
<br>
operationId: AttachController.post wx attachment by projectid
parameters:
- in: query
name: pid
description: The projectid of attachment
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AddArticle'
"400":
description: Invalid page supplied
"404":
description: project not found
/wx/addwxdiary:
post:
tags:
- wx
description: |-
post diary by projectid
<br>
operationId: DiaryController.post wx diary by catalogId
parameters:
- in: query
name: projectid
description: The projectid of diary
required: true
type: string
- in: query
name: title
description: The title of diary
required: true
type: string
- in: query
name: diarydate
description: The diarydate of diary
required: true
type: string
- in: query
name: diaryactivity
description: The diaryactivity of diary
required: true
type: string
- in: query
name: diaryweather
description: The diaryweather of diary
required: true
type: string
- in: query
name: content
description: The content of diary
required: true
type: string
- in: query
name: skey
description: The skey of user
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AddDiary'
"400":
description: Invalid page supplied
"404":
description: Diary not found
/wx/addwxeditorarticle:
post:
tags:
- wx
description: |-
post article by catalogid
<br>
operationId: ArticleController.post wx artile by catalogId
parameters:
- in: query
name: title
description: The title of article
required: true
type: string
- in: query
name: content
description: The content of article
required: true
type: string
- in: query
name: skey
description: The skey of user
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AddArticle'
"400":
description: Invalid page supplied
"404":
          description: article not found
/wx/addwxfinance/{id}:
post:
tags:
- wx
description: |-
post finance by projectid
<br>
operationId: FinanceController.post wx finance by catalogId
parameters:
- in: path
name: id
description: The projectid of finance
required: true
type: string
- in: query
name: amount
description: The amount of finance
required: true
type: string
- in: query
name: radio
description: The radio of finance
required: true
type: string
- in: query
name: radio2
description: The radio2 of finance
required: true
type: string
- in: query
name: financedate
description: The financedate of finance
required: true
type: string
- in: query
name: financeactivity
description: The financeactivity of finance
required: true
type: string
- in: query
name: content
description: The content of finance
required: true
type: string
- in: query
name: skey
description: The skey of user
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AddFinance'
"400":
description: Invalid page supplied
"404":
description: Finance not found
/wx/addwxlike/{id}:
post:
tags:
- wx
description: |-
post like by articleId
<br>
operationId: ReplyController.post wx like by articleId
parameters:
- in: query
name: id
description: The id of article
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AddTopicLike'
"400":
description: Invalid page supplied
"404":
description: article not found
/wx/addwxrelease/{id}:
post:
tags:
- wx
description: |-
post release by articleId
<br>
operationId: ReplyController.post wx release by articleId
parameters:
- in: query
name: content
description: The content of release
required: true
type: string
- in: query
name: app_version
        description: The app_version of wx
type: string
- in: query
name: avatar
description: The avatar of release
type: string
- in: query
name: username
description: The username of release
type: string
- in: query
name: publish_time
description: The time of release
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AddTopicReply'
"400":
description: Invalid page supplied
"404":
description: article not found
/wx/addwxuser:
post:
tags:
- wx
description: |-
post user
<br>
operationId: UserController.post wx user
parameters:
- in: query
name: uname
description: The username of user
required: true
type: string
- in: query
name: nickname
description: The nickname of user
required: true
type: string
- in: query
name: password
description: The Password of user
required: true
type: string
- in: query
name: email
description: The email of user
type: string
- in: query
name: department
description: The department of user
type: string
- in: query
name: secoffice
description: The secoffice of user
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AddUser'
"400":
description: Invalid page supplied
"404":
description: user not found
/wx/addwxuserpays:
post:
tags:
- wx
description: |-
post userpay by articleid
<br>
operationId: PayController.post wx userpay by articleid
parameters:
- in: query
name: articleid
description: The id of article
required: true
type: string
- in: query
name: amount
        description: The amount of pays
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AddUserPays'
"400":
description: Invalid page supplied
"404":
          description: pays not found
/wx/applyrecharge:
get:
tags:
- wx
description: |-
get user recharge
<br>
operationId: PayController.get user recharge
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.Recharge'
"400":
description: Invalid page supplied
"404":
description: recharge not found
/wx/businesscheck:
post:
tags:
- wx
description: |-
        post checkin person
<br>
operationId: BusinessController.post checkin person
parameters:
- in: query
name: userid
description: The userid for person
required: true
type: string
- in: query
name: businessid
description: The businessid for Business
required: true
type: string
- in: query
name: lat
        description: The lat of check
required: true
type: string
- in: query
name: lng
        description: The lng of check
required: true
type: string
- in: query
name: photoUrl
description: The photoUrl of check
required: true
type: string
- in: query
name: year
        description: The year of check
required: true
type: string
- in: query
name: month
        description: The month of check
required: true
type: string
- in: query
name: day
        description: The day of check
required: true
type: string
- in: query
name: location
description: The location of check
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/wx/businessmonthcheck/{id}:
get:
tags:
- wx
description: |-
get businessmonthcheck
<br>
operationId: BusinessController.get businessmothcheckin
parameters:
- in: path
name: id
description: The projectid of business
required: true
type: string
- in: query
name: page
description: The page of check
type: string
- in: query
name: limit
description: The size of check
type: string
- in: query
name: year
description: The year of check
required: true
type: string
- in: query
name: month
description: The month of check
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/wx/businessmonthcheck2/{id}:
get:
tags:
- wx
description: |-
get businessmonthcheck
<br>
operationId: BusinessController.get businessmothcheckin
parameters:
- in: path
name: id
description: The projectid of business
required: true
type: string
- in: query
name: page
description: The page of check
type: string
- in: query
name: limit
description: The size of check
type: string
- in: query
name: year
description: The year of check
required: true
type: string
- in: query
name: month
description: The month of check
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/wx/businessmonthcheck3/{id}:
get:
tags:
- wx
description: |-
get businessmonthcheck
<br>
operationId: BusinessController.get businessmothcheckin
parameters:
- in: path
name: id
description: The projectid of business
required: true
type: string
- in: query
name: page
description: The page of check
type: string
- in: query
name: limit
description: The size of check
type: string
- in: query
name: year
description: The year of check
required: true
type: string
- in: query
name: month
description: The month of check
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/wx/businessmonthcheck4/{id}:
get:
tags:
- wx
description: |-
get businessmonthcheck
<br>
operationId: BusinessController.get businessmothcheckin
parameters:
- in: path
name: id
description: The projectid of business
required: true
type: string
- in: query
name: page
description: The page of check
type: string
- in: query
name: limit
description: The size of check
type: string
- in: query
name: year
description: The year of check
required: true
type: string
- in: query
name: month
description: The month of check
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/wx/businessmonthcheck5/{id}:
get:
tags:
- wx
description: |-
get businessmonthcheck
<br>
operationId: BusinessController.get businessmothcheckin
parameters:
- in: path
name: id
description: The projectid of business
required: true
type: string
- in: query
name: page
description: The page of check
type: string
- in: query
name: limit
description: The size of check
type: string
- in: query
name: year
description: The year of check
required: true
type: string
- in: query
name: month
description: The month of check
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/wx/businessmonthchecksum:
get:
tags:
- wx
description: |-
get businessmonthcheck
<br>
operationId: BusinessController.get businessmothcheckin
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/wx/businessmonthchecksum2:
get:
tags:
- wx
description: |-
get businessmonthcheck
<br>
operationId: BusinessController.get businessmothcheckin
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/wx/businessmonthchecksum3:
get:
tags:
- wx
description: |-
get businessmonthcheck
<br>
operationId: BusinessController.get businessmothcheckin
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/wx/businessmonthchecksum4:
get:
tags:
- wx
description: |-
get businessmonthcheck
<br>
operationId: BusinessController.get businessmothcheckin
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/wx/businessmonthchecksum5:
get:
tags:
- wx
description: |-
get businessmonthcheck
<br>
operationId: BusinessController.get businessmothcheckin
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/wx/createphotodata:
get:
tags:
- wx
description: |-
get photo
<br>
operationId: PhotoController.get photo
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetAttachbyId'
"400":
description: Invalid page supplied
"404":
description: data not found
/wx/deletefreecad:
post:
tags:
- wx
description: |-
post freecad model delete by id...
<br>
operationId: ArticleController.post freecad model delete by id...
parameters:
- in: query
name: id
description: The id of freecad
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.FreeCAD'
"400":
description: Invalid page supplied
"404":
description: data not found
/wx/deleteuser:
post:
tags:
- wx
description: |-
delete user
<br>
operationId: UserController.delete user
parameters:
- in: query
name: ids
description: The ids of user
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.User'
"400":
description: Invalid page supplied
"404":
description: user not found
/wx/deletewxarticle:
post:
tags:
- wx
description: |-
        delete article by articleId
<br>
operationId: ArticleController.post wx artile by articleId
parameters:
- in: query
name: id
description: The id of article
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AddArticle'
"400":
description: Invalid page supplied
"404":
          description: article not found
/wx/deletewxdiary:
post:
tags:
- wx
description: |-
        delete diary by diaryId
<br>
operationId: DiaryController.post wx diary by diaryId
parameters:
- in: query
name: id
description: The id of diary
required: true
type: string
responses:
"200":
description: ""
schema:
            $ref: '#/definitions/models.AddDiary'
"400":
description: Invalid page supplied
"404":
          description: diary not found
/wx/deletewxfinance:
post:
tags:
- wx
description: |-
        delete finance by financeId
<br>
operationId: FinanceController.post wx finance by financeId
parameters:
- in: query
name: id
description: The id of finance
required: true
type: string
responses:
"200":
description: ""
schema:
            $ref: '#/definitions/models.AddFinance'
"400":
description: Invalid page supplied
"404":
          description: finance not found
/wx/deletewxrelease/{id}:
post:
tags:
- wx
description: |-
delete release by releaseid
<br>
operationId: ReplyController.delete wx release by releaseid
parameters:
- in: query
name: id
description: The id of release
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.DeleteTopicReply'
"400":
description: Invalid page supplied
"404":
description: article not found
/wx/downloadstandard/{id}:
get:
tags:
- wx
description: |-
get standardpdf by id
<br>
operationId: StandardController.dowload standardpdf
parameters:
- in: path
name: id
description: The id of standardpdf
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetAttachbyId'
"400":
description: Invalid page supplied
"404":
description: pdf not found
/wx/elasticsearch:
get:
tags:
- wx
description: |-
        get search
<br>
operationId: StandardController.get search
parameters:
- in: query
name: q
        description: The query of search
type: string
- in: formData
name: a
description: The after...
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetSearch'
"400":
description: Invalid page supplied
"404":
description: Search not found
/wx/getansysdata/{id}:
get:
tags:
- wx
description: |-
get ansys data by link
<br>
operationId: AttachController.dowload ansys data
parameters:
- in: path
name: id
description: The id of data
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetAttachbyId'
"400":
description: Invalid page supplied
"404":
description: data not found
/wx/getapplyrecharge:
get:
tags:
- wx
description: |-
get user recharge
<br>
operationId: PayController.get user recharge
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.Recharge'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/wx/getapplyrechargedata:
get:
tags:
- wx
description: |-
get user rechargedata
<br>
operationId: PayController.get user rechargedata
parameters:
- in: query
name: page
description: The page of recharges
type: string
- in: query
name: limit
description: The size of recharges
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.Recharge'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/wx/getbusiness/{id}:
get:
tags:
- wx
description: |-
get business by userid
<br>
operationId: BusinessController.get business by userid
parameters:
- in: path
name: id
description: The projectid of business
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetBusinessPage'
"400":
description: Invalid page supplied
"404":
description: business not found
/wx/getbusinessbyid/{id}:
get:
tags:
- wx
description: |-
get business by businessid
<br>
operationId: BusinessController.get business by businessid
parameters:
- in: path
name: id
description: The id of business
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetBusinessPage'
"400":
description: Invalid page supplied
"404":
description: business not found
/wx/getbusinesscheck:
get:
tags:
- wx
description: |-
get check
<br>
operationId: BusinessController.get checkin check
parameters:
- in: query
name: userid
description: The userId of check
required: true
type: string
- in: query
name: businessid
        description: The businessid of check
required: true
type: string
- in: query
name: year
description: The year of check
required: true
type: string
- in: query
name: month
description: The month of check
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/wx/getelasticstandard:
get:
tags:
- wx
description: |-
get standard elasticsearch web
<br>
operationId: StandardController.get standard elasticsearch web
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetElastic'
"400":
description: Invalid page supplied
"404":
description: Elastic not found
/wx/getexcelpdf/{id}:
get:
tags:
- wx
description: |-
        get excel pdf by link
<br>
operationId: AttachController.dowload math pdf
parameters:
- in: path
name: id
description: The id of pdf
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetAttachbyId'
"400":
description: Invalid page supplied
"404":
description: pdf not found
/wx/getlistarticles:
get:
tags:
- wx
description: |-
get articles by page
<br>
operationId: ArticleController.get wx artiles list
parameters:
- in: query
name: page
description: The page for articles list
required: true
type: string
- in: query
name: limit
description: The limit of page for articles list
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/wx/getlocation/{id}:
get:
tags:
- wx
description: |-
get location by userid
<br>
operationId: LocationController.get location by userid
parameters:
- in: path
name: id
description: The projectid of location
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetLocationPage'
"400":
description: Invalid page supplied
"404":
description: location not found
/wx/getlocationbyid/{id}:
get:
tags:
- wx
description: |-
get location by locationid
<br>
operationId: LocationController.get location by locationid
parameters:
- in: path
name: id
description: The id of location
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetLocationPage'
"400":
description: Invalid page supplied
"404":
description: location not found
/wx/getmathpdf/{id}:
get:
tags:
- wx
description: |-
get math pdf by link
<br>
operationId: AttachController.dowload math pdf
parameters:
- in: path
name: id
description: The id of pdf
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetAttachbyId'
"400":
description: Invalid page supplied
"404":
description: pdf not found
/wx/getmonthphotodata:
get:
tags:
- wx
description: |-
get photo
<br>
operationId: PhotoController.get photo
parameters:
- in: query
name: page
description: The page for photos list
required: true
type: string
- in: query
name: limit
description: The limit of page for photos list
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetAttachbyId'
"400":
description: Invalid page supplied
"404":
          description: photo not found
/wx/getmonthvideodata:
get:
tags:
- wx
description: |-
get video
<br>
operationId: VideoController.get video
parameters:
- in: query
name: page
description: The page for video list
required: true
type: string
- in: query
name: limit
description: The limit of page for video list
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetAttachbyId'
"400":
description: Invalid page supplied
"404":
          description: video not found
/wx/getpay:
get:
tags:
- wx
description: |-
get pay by page
<br>
operationId: PayController.get pay list
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/wx/getpostdata:
post:
tags:
- wx
description: |-
get Data(file)
<br>
operationId: MainController.get Data(file)
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/wx/getuserarticle:
get:
tags:
- wx
description: |-
get userarticles by projid
<br>
operationId: ArticleController.get wx userarticles count
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: articles not found
/wx/getuserbyusername:
get:
tags:
- wx
description: |-
get user
<br>
operationId: UserController.get user
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.User'
"400":
description: Invalid page supplied
"404":
description: user not found
/wx/getusergetappreciations:
get:
tags:
- wx
description: |-
get userappreciation by page
<br>
operationId: PayController.get userappreciations list
parameters:
- in: query
name: page
description: The page for myappreciation list
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/wx/getusermoney:
get:
tags:
- wx
description: |-
get user money
<br>
operationId: PayController.get user money
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetUserMoney'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/wx/getuserpay:
get:
tags:
- wx
description: |-
get pay by page
<br>
operationId: PayController.get pay list
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/wx/getuserpayappreciations:
get:
tags:
- wx
description: |-
get userappreciation by page
<br>
operationId: PayController.get userappreciations list
parameters:
- in: query
name: page
description: The page for myappreciation list
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/wx/getuserpaylist:
get:
tags:
- wx
description: |-
get userpay by page
<br>
operationId: PayController.get userpays list
parameters:
- in: query
name: page
description: The page for mymoney list
required: true
type: string
- in: query
name: limit
description: The size of page
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/wx/getuservideo/{id}:
get:
tags:
- wx
description: |-
get uservideolist
<br>
operationId: VideoController.get uservideolist
parameters:
- in: path
name: id
description: The id of project
required: true
type: string
- in: query
name: searchText
description: The searchText of uservideo
type: string
- in: query
name: pageNo
description: The page for uservideo list
required: true
type: string
- in: query
name: limit
description: The limit of page for uservideo list
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.Create'
"400":
description: Invalid page supplied
"404":
          description: video not found
/wx/getwxarticle/{id}:
get:
tags:
- wx
description: |-
get article by articleid
<br>
operationId: ArticleController.get wx artile by articleId
parameters:
- in: path
name: id
description: The id of article
required: true
type: string
- in: query
name: skey
description: The skey of user
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetArticle'
"400":
description: Invalid page supplied
"404":
          description: article not found
/wx/getwxarticleflow/{id}:
get:
tags:
- wx
description: |-
get article by articleid
<br>
operationId: ArticleController.get wx artile by articleId
parameters:
- in: path
name: id
description: The id of article
required: true
type: string
- in: query
name: skey
description: The skey of user
required: true
type: string
- in: query
name: dtid
description: The id of doctype
required: true
type: string
- in: query
name: docid
description: The id of doc
required: true
type: string
- in: query
name: acid
description: The id of accesscontext
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetArticle'
"400":
description: Invalid page supplied
"404":
          description: article not found
/wx/getwxarticles:
get:
tags:
- wx
description: |-
get articles by page
<br>
operationId: ArticleController.get wx artiles list
parameters:
- in: query
name: page
description: The page for articles list
required: true
type: string
- in: query
name: limit
description: The limit of page for articles list
required: true
type: string
- in: query
name: skey
description: The skey for user
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/wx/getwxarticless/{id}:
get:
tags:
- wx
description: |-
get articles by page
<br>
operationId: ArticleController.get wx artiles list
parameters:
- in: path
name: id
description: The id of project
required: true
type: string
- in: query
name: page
description: The page for articles list
required: true
type: string
- in: query
name: limit
description: The limit of page for articles list
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/wx/getwxarticletype:
get:
tags:
- wx
description: |-
get articles by page
<br>
operationId: ArticleController.get wx artiles list
parameters:
- in: query
name: page
description: The page for articles list
required: true
type: string
- in: query
name: limit
description: The limit of page for articles list
required: true
type: string
- in: query
name: dsid
description: The id of docstate
required: true
type: string
- in: query
name: dtid
description: The id of doctype
required: true
type: string
- in: query
name: acid
description: The id of accesscontext
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/wx/getwxdiaries:
get:
tags:
- wx
description: |-
get diaries to doc
<br>
operationId: DiaryController.get wx diaries to doc
parameters:
- in: query
name: projectid
description: The projectid of diary
required: true
type: string
- in: query
name: page
description: The page for diaries list
required: true
type: string
- in: query
name: limit
description: The limit of page for diaries list
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/wx/getwxdiaries2/{id}:
get:
tags:
- wx
description: |-
get diaries by page
<br>
operationId: DiaryController.get wx diaries list
parameters:
- in: path
name: id
description: The id of diaries
required: true
type: string
- in: query
name: page
description: The page for diaries list
required: true
type: string
- in: query
name: limit
description: The limit of page for diaries list
required: true
type: string
- in: query
name: skey
description: The skey for diary
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/wx/getwxdiary/{id}:
get:
tags:
- wx
description: |-
get diary by diaryid
<br>
operationId: DiaryController.get wx diary by diaryId
parameters:
- in: path
name: id
description: The id of diary
required: true
type: string
      - in: query
name: skey
description: The skey of user
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetDiary'
"400":
description: Invalid page supplied
"404":
          description: article not found
/wx/getwxexcelpdf/{id}:
get:
tags:
- wx
description: |-
get wx pdf by link
<br>
operationId: AttachController.dowload wx pdf
parameters:
      - in: path
name: id
description: The url of pdf
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetAttachbyId'
"400":
description: Invalid page supplied
"404":
description: pdf not found
/wx/getwxexceltemppdf/{id}:
get:
tags:
- wx
description: |-
        get wx excel temp pdf by id
<br>
operationId: AttachController.dowload wx math temp pdf
parameters:
- in: path
name: id
description: The url of pdf
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetAttachbyId'
"400":
description: Invalid page supplied
"404":
description: pdf not found
/wx/getwxfinance/{id}:
get:
tags:
- wx
description: |-
get finance by financeid
<br>
operationId: FinanceController.get wx finance by financeId
parameters:
- in: path
name: id
description: The id of finance
required: true
type: string
      - in: query
name: skey
description: The skey of user
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetFinance'
"400":
description: Invalid page supplied
"404":
          description: article not found
/wx/getwxfinance2/{id}:
get:
tags:
- wx
description: |-
get finance by page
<br>
operationId: FinanceController.get wx finance list
parameters:
- in: path
name: id
description: The projectid of finance
required: true
type: string
- in: query
name: page
description: The page for finance list
required: true
type: string
- in: query
name: limit
description: The limit of page for finance list
required: true
type: string
- in: query
name: skey
description: The skey for finance
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/wx/getwxfinancelist/{id}:
get:
tags:
- wx
description: |-
get finance by page
<br>
operationId: FinanceController.get wx finance list
parameters:
- in: path
name: id
description: The projectid of finance
required: true
type: string
- in: query
name: page
description: The page for finance list
required: true
type: string
- in: query
name: limit
description: The limit of page for finance list
required: true
type: string
- in: query
name: skey
description: The skey for finance
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/wx/getwxmathpdf/{id}:
get:
tags:
- wx
description: |-
get wx pdf by link
<br>
operationId: AttachController.dowload wx pdf
parameters:
- in: path
name: id
description: The id of history pdf
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetAttachbyId'
"400":
description: Invalid page supplied
"404":
description: pdf not found
/wx/getwxpay:
get:
tags:
- wx
description: |-
get pay by page
<br>
operationId: PayController.get wx pay list
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/wx/getwxpdf/{id}:
get:
tags:
- wx
description: |-
get wx pdf by id
<br>
operationId: AttachController.dowload wx pdf
parameters:
- in: path
name: id
description: The id of pdf
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetAttachbyId'
"400":
description: Invalid page supplied
"404":
description: pdf not found
/wx/getwxpdflist:
get:
tags:
- wx
description: |-
get pdf by page
<br>
operationId: SearchController.get wx pdf list
parameters:
- in: query
name: keyword
description: The keyword of pdf
type: string
- in: query
name: projectid
description: The projectid of pdf
type: string
- in: query
name: searchpage
description: The page for pdf list
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: pdf not found
/wx/getwxtemppdf/{id}:
get:
tags:
- wx
description: |-
get wx math temp pdf by id
<br>
operationId: AttachController.dowload wx math temp pdf
parameters:
- in: path
name: id
description: The url of pdf
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetAttachbyId'
"400":
description: Invalid page supplied
"404":
description: pdf not found
/wx/getwxuserarticles/{id}:
get:
tags:
- wx
description: |-
get userarticles by projid
<br>
operationId: ArticleController.get wx userarticles count
parameters:
- in: path
name: id
description: The id of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: articles not found
/wx/getwxusergetappreciations:
get:
tags:
- wx
description: |-
get userappreciation by page
<br>
operationId: PayController.get wx userappreciations list
parameters:
- in: query
name: page
description: The page for myappreciation list
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/wx/getwxusermoney:
get:
tags:
- wx
description: |-
get user money
<br>
operationId: PayController.get wx user money
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetUserMoney'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/wx/getwxuserpayappreciations:
get:
tags:
- wx
description: |-
get userappreciation by page
<br>
operationId: PayController.get wx userappreciations list
parameters:
- in: query
name: page
description: The page for myappreciation list
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/wx/getwxuserpays:
get:
tags:
- wx
description: |-
get userpay by page
<br>
operationId: PayController.get wx userpays list
parameters:
- in: query
name: page
description: The page for mymoney list
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/wx/importusers:
post:
tags:
- wx
description: |-
import users
<br>
operationId: UserController.import users
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.User'
"400":
description: Invalid page supplied
"404":
description: user not found
/wx/islogin:
get:
tags:
- wx
description: |-
        get login status
<br>
operationId: LoginController.get user login...
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/wx/loginpost:
post:
tags:
- wx
description: |-
        post user login
<br>
operationId: LoginController.post user login...
parameters:
- in: query
name: uname
description: The name of user
required: true
type: string
- in: query
name: pwd
description: The password of user
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/wx/passlogin:
get:
tags:
- wx
description: |-
post wx login
<br>
operationId: LoginController.post wx login
responses:
"200":
description: ""
schema:
$ref: '#/definitions/success'
"400":
description: Invalid page supplied
"404":
          description: article not found
/wx/photo:
get:
tags:
- wx
description: |-
get photo
<br>
operationId: PhotoController.get photo
parameters:
- in: query
name: page
description: The page for photos list
required: true
type: string
- in: query
name: limit
description: The limit of page for photos list
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetAttachbyId'
"400":
description: Invalid page supplied
"404":
          description: photo not found
/wx/photo_back:
get:
tags:
- wx
description: |-
get photo
<br>
operationId: PhotoController.get photo
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetAttachbyId'
"400":
description: Invalid page supplied
"404":
          description: photo not found
/wx/photodetail:
get:
tags:
- wx
description: |-
get photodetail
<br>
operationId: PhotoController.get photodetail
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetAttachbyId'
"400":
description: Invalid page supplied
"404":
          description: photo not found
/wx/photodetail_back:
get:
tags:
- wx
description: |-
get photodetail
<br>
operationId: PhotoController.get photodetail
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetAttachbyId'
"400":
description: Invalid page supplied
"404":
          description: photo not found
/wx/postdata:
post:
tags:
- wx
description: |-
post Data(file)
<br>
operationId: MainController.post Data(file)
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
          description: articles not found
/wx/project/product/pdf/{id}:
get:
tags:
- wx
description: |-
get documentdetail
<br>
operationId: AttachController.get wf document details
parameters:
- in: query
name: dtid
description: The id of doctype
required: true
type: string
- in: query
name: docid
description: The id of doc
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: data not found
/wx/search:
get:
tags:
- wx
description: |-
get standardlist
<br>
operationId: StandardController.get standardlist
parameters:
- in: query
name: searchText
description: The searchText of standard
type: string
- in: query
name: pageNo
description: The page for standard list
required: true
type: string
- in: query
name: limit
description: The limit of page for standard list
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.Create'
"400":
description: Invalid page supplied
"404":
          description: standard not found
/wx/searchproduct:
get:
tags:
- wx
description: |-
        get all projects' products search by page
<br>
operationId: SearchController.get allprojects' products search list
parameters:
- in: query
name: keyword
description: The keyword of products
type: string
- in: query
name: productid
description: The id of project
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: drawings not found
/wx/searchproductdata:
get:
tags:
- wx
description: |-
get all projects' products search by page
<br>
operationId: SearchController.get all projects' products search list
parameters:
- in: query
name: keyword
description: The keyword of products
required: true
type: string
- in: query
name: searchText
description: The searchText of products
type: string
- in: query
name: limit
description: The limit of products list
type: string
- in: query
name: pageNo
description: The page of products list
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: drawings not found
/wx/searchprojectproduct:
get:
tags:
- wx
description: |-
get a project's products search by page
<br>
operationId: SearchController.get a project's products search list
parameters:
- in: query
name: keyword
description: The keyword of products
type: string
- in: query
name: productid
description: The id of project
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: drawings not found
/wx/searchprojectproductdata:
get:
tags:
- wx
description: |-
get a project's products search by page
<br>
operationId: SearchController.get a project's products search list
parameters:
- in: query
name: keyword
description: The keyword of products
required: true
type: string
- in: query
name: searchText
description: The searchText of products
type: string
- in: query
name: pageNo
        description: The pageNo of products list
type: string
- in: query
name: limit
description: The limit of products list
type: string
- in: query
name: productid
description: The id of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: drawings not found
/wx/searchwxarticles/{id}:
get:
tags:
- wx
description: |-
get articles by page
<br>
operationId: ArticleController.get wx articles list
parameters:
- in: path
name: id
description: The projectid of article
required: true
type: string
- in: query
name: keyword
description: The keyword of article
required: true
type: string
- in: query
name: limit
description: The limit for articles list
required: true
type: string
- in: query
name: searchpage
description: The page for articles list
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetArticles'
"400":
description: Invalid page supplied
"404":
description: articles not found
/wx/searchwxdrawings:
get:
tags:
- wx
description: |-
get drawings by page
<br>
operationId: SearchController.get wx drawings list
parameters:
- in: query
name: keyword
description: The keyword of drawings
type: string
- in: query
name: projectid
description: The projectid of drawings
type: string
- in: query
name: searchpage
description: The page for drawings list
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: drawings not found
/wx/searchwxproducts:
get:
tags:
- wx
description: |-
        get products by page
<br>
operationId: SearchController.get wx drawings list
parameters:
- in: query
name: keyword
description: The keyword of drawings
type: string
- in: query
name: projectid
description: The projectid of drawings
type: string
- in: query
name: searchpage
description: The page for drawings list
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: drawings not found
/wx/searchwxstandards:
get:
tags:
- wx
description: |-
get standards by page
<br>
operationId: StandardController.get wx standards list
parameters:
- in: query
name: keyword
description: The keyword of standards
required: true
type: string
- in: query
name: searchpage
        description: The page for standards list
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetProductsPage'
"400":
description: Invalid page supplied
"404":
description: standards not found
/wx/ssologin:
get:
tags:
- wx
description: |-
        get sso login template
<br>
operationId: LoginController.get sso login
parameters:
- in: query
name: service
description: The service of login
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/success'
"400":
description: Invalid page supplied
"404":
          description: article not found
/wx/ssologinpost:
post:
tags:
- wx
description: |-
post sso login data
<br>
operationId: LoginController.post sso login
parameters:
- in: query
name: login_name
description: The name of user
required: true
type: string
- in: query
name: password
description: The password of user
required: true
type: string
- in: query
name: service
description: The service of location.href
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/success'
"400":
description: Invalid page supplied
"404":
          description: article not found
/wx/standard/upload:
post:
tags:
- wx
description: |-
post file by BootstrapFileInput
<br>
operationId: StandardController.post bootstrapfileinput
responses:
"200":
description: ""
schema:
$ref: '#/definitions/SUCCESS'
"400":
description: Invalid page supplied
"404":
description: page not found
/wx/standardpdf:
get:
tags:
- wx
description: |-
get standardpdf
<br>
operationId: StandardController.get standardpdf
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.Create'
"400":
description: Invalid page supplied
"404":
          description: standard not found
/wx/tianditusearch:
get:
tags:
- wx
description: |-
get tianditu location
<br>
operationId: SupaMapusController.get tianditu location
parameters:
- in: query
name: keyword
description: The keyword of location
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetAttachbyId'
"400":
description: Invalid page supplied
"404":
description: location not found
/wx/updatebusiness/{id}:
post:
tags:
- wx
description: |-
post update business by businessid
<br>
operationId: BusinessController.post update business by businessid
parameters:
- in: query
name: content
        description: The content of business
required: true
type: string
- in: query
name: location
description: The location of business
required: true
type: string
- in: query
name: lat
description: The lat of location
type: string
- in: query
name: lng
description: The lng of location
type: string
- in: query
name: startDate
description: The startDate of business
type: string
- in: query
name: endDate
description: The endDate of business
type: string
- in: query
name: projecttitle
description: The projecttitle of business
type: string
- in: query
name: drivername
description: The drivername of business
type: string
- in: query
name: subsidy
description: The subsidy of business
type: string
- in: query
name: carfare
description: The carfare of business
type: string
- in: query
name: hotelfee
description: The hotelfee of business
type: string
- in: query
name: users
description: The users of business
type: string
- in: query
name: articleshow
        description: The articleshow of business
type: string
- in: path
name: id
description: The id of business
required: true
type: string
- in: query
name: title
description: The title of article
type: string
- in: query
name: articlecontent
description: The content of article
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.UpdateBusiness'
"400":
description: Invalid page supplied
"404":
description: business not found
/wx/updateuser:
post:
tags:
- wx
description: |-
update user
<br>
operationId: UserController.update user
parameters:
- in: query
name: pk
description: The pk of user
type: string
- in: query
name: name
description: The name of user
type: string
- in: query
name: value
description: The value of user
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.User'
"400":
description: Invalid page supplied
"404":
description: user not found
/wx/updatewxdiary:
post:
tags:
- wx
description: |-
post diary by diaryid
<br>
operationId: DiaryController.post wx diary by diaryid
parameters:
- in: query
name: id
description: The id of diary
required: true
type: string
- in: query
name: title
description: The title of diary
required: true
type: string
- in: query
name: content
description: The content of diary
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AddDiary'
"400":
description: Invalid page supplied
"404":
description: article not found
/wx/updatewxeditorarticle:
post:
tags:
- wx
description: |-
post article by articleid
<br>
operationId: ArticleController.post wx article by articleid
parameters:
- in: query
name: id
description: The id of article
required: true
type: string
- in: query
name: title
description: The title of article
required: true
type: string
- in: query
name: content
description: The content of article
required: true
type: string
- in: query
name: skey
description: The skey of user
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AddArticle'
"400":
description: Invalid page supplied
"404":
description: article not found
/wx/updatewxfinance:
post:
tags:
- wx
description: |-
post finance by financeid
<br>
operationId: FinanceController.post wx finance by financeid
parameters:
- in: query
name: id
description: The id of finance
required: true
type: string
- in: query
name: amount
description: The amount of finance
required: true
type: string
- in: query
name: radio
description: The radio of finance
required: true
type: string
- in: query
name: radio2
description: The radio2 of finance
required: true
type: string
- in: query
name: financedate
description: The financedate of finance
required: true
type: string
- in: query
name: content
description: The content of finance
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AddFinance'
"400":
description: Invalid page supplied
"404":
description: article not found
/wx/updatewxuser:
post:
tags:
- wx
description: |-
post user password by uid
<br>
operationId: UserController.post wx userpassword by uid
parameters:
- in: query
name: uid
description: The id of user
required: true
type: string
- in: query
name: oldpass
description: The oldPassword of user
required: true
type: string
- in: query
name: newpass
description: The newpassword of user
required: true
type: string
- in: path
name: id
description: The id of wx
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.AddArticle'
"400":
description: Invalid page supplied
"404":
description: article not found
/wx/uploadappreciationphoto:
post:
tags:
- wx
description: |-
post user avatar
<br>
operationId: FroalaController.post wx user avatar
responses:
"200":
description: ""
schema:
$ref: '#/definitions/SUCCESS'
"400":
description: Invalid page supplied
"404":
description: article not found
/wx/uploadphoto:
get:
tags:
- wx
description: |-
upload photo page
<br>
operationId: PhotoController.upload photo page
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.UploadPhoto'
"400":
description: Invalid page supplied
"404":
description: photo not found
/wx/uploadphotodata:
post:
tags:
- wx
description: |-
post photo
<br>
operationId: PhotoController.post wx photo img
responses:
"200":
description: ""
schema:
$ref: '#/definitions/SUCCESS'
"400":
description: Invalid page supplied
"404":
description: photo not found
/wx/uploadstandard:
get:
tags:
- wx
description: |-
get upload file to standard html
<br>
operationId: StandardController.upload file to standard html
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.Elastic'
"400":
description: Invalid page supplied
"404":
description: Page not found
/wx/uploadvideo:
get:
tags:
- wx
description: |-
upload video
<br>
operationId: VideoController.upload videodata
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.uploadvideo'
"400":
description: Invalid page supplied
"404":
description: video not found
/wx/uploadvideodata:
post:
tags:
- wx
description: |-
post video
<br>
operationId: PhotoController.post wx video data
responses:
"200":
description: ""
schema:
$ref: '#/definitions/SUCCESS'
"400":
description: Invalid page supplied
"404":
description: video not found
/wx/uploadwxavatar:
post:
tags:
- wx
description: |-
post user avatar
<br>
operationId: FroalaController.post wx user avatar
responses:
"200":
description: ""
schema:
$ref: '#/definitions/SUCCESS'
"400":
description: Invalid page supplied
"404":
description: article not found
/wx/uploadwxeditorimg:
post:
tags:
- wx
description: |-
post article img by catalogid
<br>
operationId: FroalaController.post wx article img by catalogId
parameters:
- in: query
name: projectid
description: The projectid of wxeditor
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/SUCCESS'
"400":
description: Invalid page supplied
"404":
description: article not found
/wx/uploadwximg:
post:
tags:
- wx
description: |-
post article img by catalogid
<br>
operationId: FroalaController.post wx article img by catalogId
responses:
"200":
description: ""
schema:
$ref: '#/definitions/SUCCESS'
"400":
description: Invalid page supplied
"404":
description: article not found
/wx/uploadwximgs/{id}:
post:
tags:
- wx
description: |-
post article img by catalogid
<br>
operationId: FroalaController.post wx article img by catalogId
parameters:
- in: path
name: id
description: The id of project
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/SUCCESS'
"400":
description: Invalid page supplied
"404":
description: article not found
/wx/uploadwxvideo/{id}:
post:
tags:
- wx
description: |-
post video by catalogid
<br>
operationId: FroalaController.post wx video by catalogId
parameters:
- in: path
name: id
description: The id of project
required: true
type: string
- in: query
name: desc
description: The descript of video
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/SUCCESS'
"400":
description: Invalid page supplied
"404":
description: article not found
/wx/uploadwxvideocover/{id}:
post:
tags:
- wx
description: |-
post videoCover by videoid
<br>
operationId: FroalaController.post wx videoCover by videoid
parameters:
- in: query
name: id
description: The id of video
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/SUCCESS'
"400":
description: Invalid page supplied
"404":
description: article not found
/wx/user/{id}:
get:
tags:
- wx
description: |-
get user by userid
<br>
operationId: UserController.get user
parameters:
- in: path
name: id
description: The id of user
type: string
- in: query
name: role
description: The role of user
type: string
- in: query
name: searchText
description: The searchText of users
type: string
- in: query
name: pageNo
description: The page for users list
required: true
type: string
- in: query
name: limit
description: The limit of page for users list
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.User'
"400":
description: Invalid page supplied
"404":
description: user not found
/wx/usermanage:
get:
tags:
- wx
description: |-
get usermanage
<br>
operationId: MainController.get usermanage
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetAttachbyId'
"400":
description: Invalid page supplied
"404":
description: page not found
/wx/usermyself:
get:
tags:
- wx
description: |-
get usermyself
<br>
operationId: UserController.get usermyself
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.User'
"400":
description: Invalid page supplied
"404":
description: user not found
/wx/video:
get:
tags:
- wx
description: |-
get video
<br>
operationId: VideoController.get video
parameters:
- in: query
name: page
description: The page for videos list
required: true
type: string
- in: query
name: limit
description: The limit of page for videos list
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetAttachbyId'
"400":
description: Invalid page supplied
"404":
description: video not found
/wx/videodetail/{id}:
get:
tags:
- wx
description: |-
get videodetail
<br>
operationId: VideoController.get videodetail
parameters:
- in: path
name: id
description: The id of video
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetVideobyId'
"400":
description: Invalid page supplied
"404":
description: video not found
/wx/wxelasticsearch:
get:
tags:
- wx
description: |-
get search
<br>
operationId: StandardController.get search
parameters:
- in: query
name: keyword
description: The keyword of search
type: string
- in: formData
name: search_after
description: The search_after cursor for paginated results
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetSearch'
"400":
description: Invalid page supplied
"404":
description: Search not found
/wx/wxhassession:
get:
tags:
- wx
description: |-
get wx usersession
<br>
operationId: LoginController.get wx haslogin
responses:
"200":
description: ""
schema:
$ref: '#/definitions/success'
"400":
description: Invalid page supplied
"404":
description: article not found
/wx/wxlogin/{id}:
get:
tags:
- wx
description: |-
post wx login
<br>
operationId: LoginController.post wx login
parameters:
- in: path
name: id
description: The id of wx
required: true
type: string
- in: path
name: code
description: The jscode of wxuser
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/success'
"400":
description: Invalid page supplied
"404":
description: article not found
/wx/wxpdf/{id}:
get:
tags:
- wx
description: |-
get wx pdf by id
<br>
operationId: AttachController.download wx pdf
parameters:
- in: path
name: id
description: The id of pdf
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetAttachbyId'
"400":
description: Invalid page supplied
"404":
description: pdf not found
/wx/wxregion:
post:
tags:
- wx
description: |-
post wx region
<br>
operationId: RegistController.post wx region
parameters:
- in: query
name: uname
description: The username of user
required: true
type: string
- in: query
name: password
description: The password of account
required: true
type: string
- in: query
name: code
description: The code of wx
required: true
type: string
- in: query
name: app_version
description: The app_version of wx
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.SaveUser'
"400":
description: Invalid page supplied
"404":
description: user not found
/wx/wxregist:
post:
tags:
- wx
description: |-
post wx regist
<br>
operationId: RegistController.post wx regist
parameters:
- in: query
name: uname
description: The username of user
required: true
type: string
- in: query
name: password
description: The password of account
required: true
type: string
- in: query
name: code
description: The code of wx
required: true
type: string
- in: query
name: app_version
description: The app_version of wx
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.SaveUser'
"400":
description: Invalid page supplied
"404":
description: user not found
/wx/wxstandardpdf/{id}:
get:
tags:
- wx
description: |-
get wx standardpdf by id
<br>
operationId: StandardController.download wx standardpdf
parameters:
- in: path
name: id
description: The id of standardpdf
required: true
type: string
- in: query
name: token
description: The token of user
required: true
type: string
responses:
"200":
description: ""
schema:
$ref: '#/definitions/models.GetAttachbyId'
"400":
description: Invalid page supplied
"404":
description: pdf not found
definitions:
FileNode1:
title: FileNode1
type: object
SUCCESS:
title: SUCCESS
type: object
gorm.DeletedAt:
title: DeletedAt
type: object
gorm.Model:
title: Model
type: object
properties:
CreatedAt:
type: string
format: datetime
DeletedAt:
$ref: '#/definitions/gorm.DeletedAt'
ID:
type: integer
format: int32
UpdatedAt:
type: string
format: datetime
models.AddAdminCategory:
title: AddAdminCategory
type: object
models.AddArticle:
title: AddArticle
type: object
models.AddDiary:
title: AddDiary
type: object
models.AddFinance:
title: AddFinance
type: object
models.AddSignature:
title: AddSignature
type: object
models.AddTopicLike:
title: AddTopicLike
type: object
models.AddTopicReply:
title: AddTopicReply
type: object
models.AddUser:
title: AddUser
type: object
models.AddUserPays:
title: AddUserPays
type: object
models.Adddiary:
title: Adddiary
type: object
models.Addfinance:
title: Addfinance
type: object
models.AnsysApdl:
title: AnsysApdl
type: object
properties:
Article:
$ref: '#/definitions/models.AnsysArticle'
ClassID:
type: integer
format: int64
CreatedAt:
type: string
format: datetime
DeletedAt:
$ref: '#/definitions/gorm.DeletedAt'
ID:
type: integer
format: int32
UpdatedAt:
type: string
format: datetime
User:
$ref: '#/definitions/models.User'
description: foreignkey:UserID;references:Id;
UserID:
description: (), tag `index`
type: integer
format: int64
path:
type: string
status:
type: boolean
title:
type: string
titleb:
description:
type: string
version:
type: string
models.AnsysArticle:
title: AnsysArticle
type: object
properties:
AnsysApdlID:
type: integer
format: int32
CreatedAt:
type: string
format: datetime
DeletedAt:
$ref: '#/definitions/gorm.DeletedAt'
ID:
type: integer
format: int32
UpdatedAt:
type: string
format: datetime
html:
type: string
subtext:
type: string
title:
type: string
models.AnsysHistory:
title: AnsysHistory
type: object
properties:
AnsysApdl:
$ref: '#/definitions/models.AnsysApdl'
CreatedAt:
type: string
format: datetime
DeletedAt:
$ref: '#/definitions/gorm.DeletedAt'
ID:
type: integer
format: int32
UpdatedAt:
type: string
format: datetime
User:
$ref: '#/definitions/models.User'
description:
ansysapdlid:
type: integer
format: int32
pdfurl:
type: string
userid:
description: (), tag `index`
type: integer
format: int64
models.AnsysHistoryInputValue:
title: AnsysHistoryInputValue
type: object
properties:
CreatedAt:
type: string
format: datetime
DeletedAt:
$ref: '#/definitions/gorm.DeletedAt'
ID:
type: integer
format: int32
UpdatedAt:
type: string
format: datetime
ansys_history_id:
type: integer
format: int32
ansysinputsid:
type: integer
format: int32
inputvalue:
type: string
models.AnsysHistoryOutputValue:
title: AnsysHistoryOutputValue
type: object
properties:
AnsysHistoryID:
type: integer
format: int32
CreatedAt:
type: string
format: datetime
DeletedAt:
$ref: '#/definitions/gorm.DeletedAt'
ID:
type: integer
format: int32
UpdatedAt:
type: string
format: datetime
ansysoutputsid:
type: integer
format: int32
outputvalue:
type: string
models.AnsysImage:
title: AnsysImage
type: object
properties:
Url:
type: string
models.AnsysInputs:
title: AnsysInputs
type: object
properties:
AnsysApdlID:
type: integer
format: int32
CreatedAt:
type: string
format: datetime
DeletedAt:
$ref: '#/definitions/gorm.DeletedAt'
ID:
type: integer
format: int32
UpdatedAt:
type: string
format: datetime
comment:
type: string
historyinputvalue:
$ref: '#/definitions/models.AnsysHistoryInputValue'
inputalias:
type: string
inputvalue:
description:
type: string
realmax:
type: string
realmin:
type: string
resulttype:
type: string
selectvalue:
type: array
items:
$ref: '#/definitions/models.Select2'
textarealvalue:
$ref: '#/definitions/models.TextAreal'
units:
type: string
models.AnsysOutputs:
title: AnsysOutputs
type: object
properties:
AnsysApdlID:
type: integer
format: int32
CreatedAt:
type: string
format: datetime
DeletedAt:
$ref: '#/definitions/gorm.DeletedAt'
ID:
type: integer
format: int32
UpdatedAt:
type: string
format: datetime
comment:
type: string
historyoutputvalue:
$ref: '#/definitions/models.AnsysHistoryOutputValue'
images:
type: array
items:
$ref: '#/definitions/models.AnsysImage'
outputalias:
type: string
outputvalue:
type: string
resulttype:
type: string
units:
type: string
models.Article:
title: Article
type: object
properties:
Created:
type: string
format: datetime
ProductId:
type: integer
format: int64
Subtext:
type: string
Updated:
type: string
format: datetime
Views:
type: integer
format: int64
html:
type: string
id:
type: integer
format: int64
models.Attachment:
title: Attachment
type: object
properties:
Created:
type: string
format: datetime
Downloads:
type: integer
format: int64
FileName:
type: string
FileSize:
type: integer
format: int64
Id:
type: integer
format: int64
ProductId:
type: integer
format: int64
Updated:
type: string
format: datetime
models.Avatar:
title: Avatar
type: object
models.Create:
title: Create
type: object
models.CreateBusiness:
title: CreateBusiness
type: object
models.CreateLocation:
title: CreateLocation
type: object
models.DeleteTopicReply:
title: DeleteTopicReply
type: object
models.DeleteUserCart:
title: DeleteUserCart
type: object
models.Elastic:
title: Elastic
type: object
models.ExcelArticle:
title: ExcelArticle
type: object
properties:
CreatedAt:
type: string
format: datetime
DeletedAt:
$ref: '#/definitions/gorm.DeletedAt'
ExcelTempleID:
type: integer
format: int32
ID:
type: integer
format: int32
UpdatedAt:
type: string
format: datetime
html:
type: string
subtext:
type: string
title:
type: string
models.ExcelHistory:
title: ExcelHistory
type: object
properties:
CreatedAt:
type: string
format: datetime
DeletedAt:
$ref: '#/definitions/gorm.DeletedAt'
ExcelTemple:
$ref: '#/definitions/models.ExcelTemple'
ID:
type: integer
format: int32
UpdatedAt:
type: string
format: datetime
User:
$ref: '#/definitions/models.User'
description:
type: string
exceltempleid:
type: integer
format: int32
faceimgurl:
type: string
pdfurl:
type: string
userid:
description: (), tag `index`
type: integer
format: int64
models.ExcelHistoryInputValue:
title: ExcelHistoryInputValue
type: object
properties:
CreatedAt:
type: string
format: datetime
DeletedAt:
$ref: '#/definitions/gorm.DeletedAt'
ID:
type: integer
format: int32
UpdatedAt:
type: string
format: datetime
excel_history_id:
type: integer
format: int32
excelinputsid:
type: integer
format: int32
inputvalue:
type: string
models.ExcelHistoryOutputValue:
title: ExcelHistoryOutputValue
type: object
properties:
CreatedAt:
type: string
format: datetime
DeletedAt:
$ref: '#/definitions/gorm.DeletedAt'
ExcelHistoryID:
description: (), tag `index`
type: integer
format: int32
ID:
type: integer
format: int32
UpdatedAt:
type: string
format: datetime
exceloutputsid:
type: integer
format: int32
outputvalue:
type: string
models.ExcelImage:
title: ExcelImage
type: object
properties:
Url:
type: string
models.ExcelInputs:
title: ExcelInputs
type: object
properties:
CreatedAt:
type: string
format: datetime
DeletedAt:
$ref: '#/definitions/gorm.DeletedAt'
ExcelTempleID:
type: integer
format: int32
ID:
type: integer
format: int32
UpdatedAt:
type: string
format: datetime
comment:
type: string
historyinputvalue:
$ref: '#/definitions/models.ExcelHistoryInputValue'
inputalias:
type: string
inputvalue:
description:
type: string
realmax:
type: string
realmin:
type: string
resulttype:
type: string
selectvalue:
type: array
items:
$ref: '#/definitions/models.Select2'
textarealvalue:
$ref: '#/definitions/models.TextAreal'
units:
type: string
models.ExcelOutputs:
title: ExcelOutputs
type: object
properties:
CreatedAt:
type: string
format: datetime
DeletedAt:
$ref: '#/definitions/gorm.DeletedAt'
ExcelTempleID:
type: integer
format: int32
ID:
type: integer
format: int32
UpdatedAt:
type: string
format: datetime
comment:
type: string
historyoutputvalue:
$ref: '#/definitions/models.ExcelHistoryOutputValue'
images:
type: array
items:
$ref: '#/definitions/models.ExcelImage'
outputalias:
type: string
outputvalue:
type: string
resulttype:
type: string
units:
type: string
models.ExcelTemple:
title: ExcelTemple
type: object
properties:
Article:
$ref: '#/definitions/models.ExcelArticle'
ClassID:
type: integer
format: int64
CreatedAt:
type: string
format: datetime
DeletedAt:
$ref: '#/definitions/gorm.DeletedAt'
ID:
type: integer
format: int32
UpdatedAt:
type: string
format: datetime
User:
$ref: '#/definitions/models.User'
UserID:
description: (), tag `index`
type: integer
format: int64
number:
type: string
path:
type: string
status:
type: boolean
title:
type: string
titleb:
description:
type: string
version:
type: string
models.FCHistoryInputValue:
title: FCHistoryInputValue
type: object
properties:
CreatedAt:
type: string
format: datetime
DeletedAt:
$ref: '#/definitions/gorm.DeletedAt'
ID:
type: integer
format: int32
UpdatedAt:
type: string
format: datetime
fcuserhistoryid:
type: integer
format: int32
freecadinputsid:
type: integer
format: int32
inputvalue:
type: string
models.FreeCAD:
title: FreeCAD
type: object
models.FreecadInputs:
title: FreecadInputs
type: object
properties:
CreatedAt:
type: string
format: datetime
DeletedAt:
$ref: '#/definitions/gorm.DeletedAt'
FreecadModelID:
type: integer
format: int32
ID:
type: integer
format: int32
UpdatedAt:
type: string
format: datetime
fchistoryinputvalue:
$ref: '#/definitions/models.FCHistoryInputValue'
description: ,UserHistoryID
inputvalue:
description:
type: number
format: double
name:
type: string
number:
type: integer
format: int64
realmax:
type: string
realmin:
type: string
remark:
type: string
selectvalue:
description: hasMany
type: array
items:
$ref: '#/definitions/models.Select2'
symbol:
type: string
textarealvalue:
$ref: '#/definitions/models.TextAreal'
unit:
type: string
models.FreecadModel:
title: FreecadModel
type: object
properties:
CreatedAt:
type: string
format: datetime
DeletedAt:
$ref: '#/definitions/gorm.DeletedAt'
ID:
type: integer
format: int32
UpdatedAt:
type: string
format: datetime
User:
$ref: '#/definitions/models.User'
UserID:
description: (), tag `index`
type: integer
format: int64
description:
type: string
dxfsvgurl:
type: string
faceimgurl:
type: string
indicators:
type: string
number:
type: string
path:
type: string
renderimgurl:
type: string
status:
type: boolean
title:
type: string
titleb:
description:
type: string
version:
type: string
models.GetAdminCategory:
title: GetAdminCategory
type: object
models.GetAllProjCalendar:
title: GetAllProjCalendar
type: object
models.GetArticle:
title: GetArticle
type: object
models.GetArticles:
title: GetArticles
type: object
models.GetAttachbyId:
title: GetAttachbyId
type: object
models.GetBusinessPage:
title: GetBusinessPage
type: object
models.GetChat:
title: GetChat
type: object
models.GetDiary:
title: GetDiary
type: object
models.GetElastic:
title: GetElastic
type: object
models.GetEstimateCostArchi:
title: GetEstimateCostArchi
type: object
models.GetEstimateCostElect:
title: GetEstimateCostElect
type: object
models.GetEstimateCostMetal:
title: GetEstimateCostMetal
type: object
models.GetEstimateCostTemp:
title: GetEstimateCostTemp
type: object
models.GetEstimateProjects:
title: GetEstimateProjects
type: object
models.GetEstimateProjectsData:
title: GetEstimateProjectsData
type: object
models.GetExcel:
title: GetExcel
type: object
models.GetFinance:
title: GetFinance
type: object
models.GetLocationPage:
title: GetLocationPage
type: object
models.GetMathcadPage:
title: GetMathcadPage
type: object
models.GetOnlyPdf:
title: GetOnlyPdf
type: object
models.GetOnlyoffice:
title: GetOnlyoffice
type: object
models.GetProductsPage:
title: GetProductsPage
type: object
models.GetProjectPage:
title: GetProjectPage
type: object
models.GetSearch:
title: GetSearch
type: object
models.GetUserMoney:
title: GetUserMoney
type: object
models.GetVideobyId:
title: GetVideobyId
type: object
models.GetansysPage:
title: GetansysPage
type: object
models.GetexcelPage:
title: GetexcelPage
type: object
models.Gettest:
title: Gettest
type: object
models.HistoryInputValue:
title: HistoryInputValue
type: object
properties:
CreatedAt:
type: string
format: datetime
DeletedAt:
$ref: '#/definitions/gorm.DeletedAt'
ID:
type: integer
format: int32
UpdatedAt:
type: string
format: datetime
inputvalue:
type: string
templeinputsid:
type: integer
format: int32
user_history_id:
type: integer
format: int32
models.HistoryOutputValue:
title: HistoryOutputValue
type: object
properties:
CreatedAt:
type: string
format: datetime
DeletedAt:
$ref: '#/definitions/gorm.DeletedAt'
ID:
type: integer
format: int32
UpdatedAt:
type: string
format: datetime
UserHistoryID:
description: (), tag `index`
type: integer
format: int32
outputvalue:
type: string
templeoutputsid:
type: integer
format: int32
models.MathArticle:
title: MathArticle
type: object
properties:
CreatedAt:
type: string
format: datetime
DeletedAt:
$ref: '#/definitions/gorm.DeletedAt'
ID:
type: integer
format: int32
UpdatedAt:
type: string
format: datetime
UserTempleID:
type: integer
format: int32
html:
type: string
subtext:
type: string
title:
type: string
models.MathPdf:
title: MathPdf
type: object
models.Ollyoffice:
title: Ollyoffice
type: object
models.Onlyoffice:
title: Onlyoffice
type: object
models.Pay:
title: Pay
type: object
properties:
Amount:
type: integer
format: int64
Article:
$ref: '#/definitions/models.Article'
ArticleID:
type: integer
format: int64
CreatedAt:
type: string
format: datetime
DeletedAt:
type: string
format: datetime
UpdatedAt:
type: string
format: datetime
User:
$ref: '#/definitions/models.User'
description: UserID?column:user_id
User2ID:
type: integer
format: int64
UserID:
description: One-To-One ( - BillingAddressID
type: integer
format: int64
id:
type: integer
format: int32
models.Recharge:
title: Recharge
type: object
properties:
Amount:
type: integer
format: int64
CreatedAt:
type: string
format: datetime
DeletedAt:
$ref: '#/definitions/gorm.DeletedAt'
ID:
type: integer
format: int32
UpdatedAt:
type: string
format: datetime
User:
$ref: '#/definitions/models.User'
UserID:
description: (), tag `index`
type: integer
format: int64
models.SaveUser:
title: SaveUser
type: object
models.Select2:
title: Select2
type: object
properties:
id:
description: tablestring
type: string
text:
type: string
value:
type: string
models.TempleInputs:
title: TempleInputs
type: object
properties:
CreatedAt:
type: string
format: datetime
DeletedAt:
$ref: '#/definitions/gorm.DeletedAt'
ID:
type: integer
format: int32
UpdatedAt:
type: string
format: datetime
UserTempleID:
type: integer
format: int32
comment:
type: string
historyinputvalue:
$ref: '#/definitions/models.HistoryInputValue'
description: ,UserHistoryID
inputalias:
type: string
inputvalue:
description:
type: string
realmax:
type: string
realmin:
type: string
resulttype:
type: string
selectvalue:
description: hasMany
type: array
items:
$ref: '#/definitions/models.Select2'
textarealvalue:
$ref: '#/definitions/models.TextAreal'
units:
type: string
models.TempleOutputs:
title: TempleOutputs
type: object
properties:
CreatedAt:
type: string
format: datetime
DeletedAt:
$ref: '#/definitions/gorm.DeletedAt'
ID:
type: integer
format: int32
UpdatedAt:
type: string
format: datetime
UserTempleID:
type: integer
format: int32
comment:
type: string
historyoutputvalue:
$ref: '#/definitions/models.HistoryOutputValue'
outputalias:
type: string
outputvalue:
type: string
resulttype:
type: string
units:
type: string
models.TextAreal:
title: TextAreal
type: object
properties:
value:
type: string
models.Update:
title: Update
type: object
models.UpdateBusiness:
title: UpdateBusiness
type: object
models.UploadPhoto:
title: UploadPhoto
type: object
models.User:
title: User
type: object
properties:
Createtime:
type: string
format: datetime
Department:
type: string
Email:
type: string
Id:
type: integer
format: int64
Ip:
type: string
IsPartyMember:
type: boolean
Lastlogintime:
type: string
format: datetime
Nickname:
type: string
Password:
type: string
Port:
type: string
Remark:
type: string
Repassword:
type: string
Secoffice:
type: string
Sex:
type: string
Status:
type: integer
format: int64
Updated:
type: string
format: datetime
name:
description:
type: string
role:
description:
type: string
models.UserHistory:
title: UserHistory
type: object
properties:
CreatedAt:
type: string
format: datetime
DeletedAt:
$ref: '#/definitions/gorm.DeletedAt'
ID:
type: integer
format: int32
UpdatedAt:
type: string
format: datetime
User:
$ref: '#/definitions/models.User'
UserTemple:
$ref: '#/definitions/models.UserTemple'
description:
type: string
faceimgurl:
type: string
pdfurl:
type: string
tempid:
type: integer
format: int32
userid:
description: (), tag `index`
type: integer
format: int64
models.UserTemple:
title: UserTemple
type: object
properties:
Article:
$ref: '#/definitions/models.MathArticle'
ClassID:
type: integer
format: int64
CreatedAt:
type: string
format: datetime
DeletedAt:
$ref: '#/definitions/gorm.DeletedAt'
ID:
type: integer
format: int32
UpdatedAt:
type: string
format: datetime
User:
$ref: '#/definitions/models.User'
UserID:
description: (), tag `index`
type: integer
format: int64
number:
type: string
path:
type: string
status:
type: boolean
title:
type: string
titleb:
description:
type: string
version:
type: string
models.ansysArticle:
title: ansysArticle
type: object
models.ansysPdf:
title: ansysPdf
type: object
models.excelArticle:
title: excelArticle
type: object
models.excelPdf:
title: excelPdf
type: object
models.pass_project:
title: pass_project
type: object
models.uploadvideo:
title: uploadvideo
type: object
success:
title: success
type: object
tags:
- name: admin
description: |
VueFlow API
- name: wx
description: |
CMSWX article API
- name: adminlog
description: |
CMSADMIN API
- name: checkin
description: |
CMSCHECKIN API
- name: bbs
description: |
CMSBBS API
- name: todo
description: |
CMSTODO API
- name: fileinput
description: |
CMSWX froala API
- name: pdfcpu
description: |
CMSADMIN API
- name: flv
description: |
CMSFLV API
- name: mathcad
description: |
CMSMATHCAD API
- name: ansys
description: |
CMSansys API
- name: excel
description: |
CMSMATHCAD API
- name: elastic
description: |
CMSELASTIC API
- name: simwe
description: |
CMSSIMWE API
- name: estimate
description: |
CMSexcel API
```
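A minimal client sketch for two of the GET endpoints in the spec above, assuming a hypothetical base URL of `http://localhost:8080` (the spec does not declare a host). It only constructs the request URLs from the documented query parameters and does not send any requests:

```python
from urllib.parse import urlencode, urljoin

# Hypothetical base URL -- the spec above declares no host or basePath.
BASE_URL = "http://localhost:8080/"

def build_search_url(keyword: str) -> str:
    """Build the GET /wx/tianditusearch request URL.

    The spec marks `keyword` as a required query parameter.
    """
    query = urlencode({"keyword": keyword})
    return urljoin(BASE_URL, "wx/tianditusearch") + "?" + query

def build_video_list_url(page: int, limit: int) -> str:
    """Build the GET /wx/video request URL.

    The spec marks `page` and `limit` as required query parameters.
    """
    query = urlencode({"page": page, "limit": limit})
    return urljoin(BASE_URL, "wx/video") + "?" + query

print(build_search_url("library"))
print(build_video_list_url(1, 20))
```

The same pattern applies to the other paginated list endpoints (`/wx/user/{id}`, `/wx/wxelasticsearch`), which take their page and limit values as string query parameters.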
A presidential library, presidential center, or presidential museum is a facility either created in honor of a former president and containing their papers, or affiliated with a country's presidency.
In the United States
The presidential library system, a network of 15 government-run libraries, plus a number of privately-run ones
Jefferson Davis Presidential Library and Museum in Biloxi, Mississippi, a privately-owned library named after Jefferson Davis, president of the secessionist Confederate States
Presidential Museum and Leadership Library in Odessa, Texas, about presidents in general
World's Smallest Presidential Library in Atchison County, Kansas, a museum exhibit about David Rice Atchison, who some have claimed was Acting President of the United States for one day
In other countries
in Baku, Azerbaijan, established in 2003
in Minsk, Belarus, a research library
in Tbilisi, Georgia, named after former president Mikheil Saakashvili
South Korea:
in Seoul, named after former president Kim Dae-jung
in Gumi, named after former president Park Chung-hee
Vicente Fox Center of Studies, Library and Museum in San Cristóbal, Guanajuato, Mexico, named after former president Vicente Fox
Olusegun Obasanjo Presidential Library in Abeokuta, Ogun State, Nigeria, named after former president Chief Olusegun Obasanjo
Boris Yeltsin Presidential Library in Saint Petersburg, Russia, named after Boris Yeltsin, first president of the Russian Federation
J. R. Jayewardene Centre in Colombo, Sri Lanka, named after Junius Richard Jayewardene, first executive president of Sri Lanka
Taiwan (Republic of China):
in Taipei, named after former president Chiang Ching-kuo
Presidential and Vice-Presidential Artifacts Museum in Taipei, with archives from various presidents and vice presidents of the country
Presidential Library in Ankara, Turkey, the largest library in the country
References
|
Hot August Nights is an annual event held in Reno and Virginia City, Nevada, during the first week of August. The event features classic vehicles manufactured before 1979 and celebrates classic rock and roll.
The event takes its name from the typically hot weather during the month of August.
History
Neil Diamond's live album Hot August Night, recorded at the Greek Theatre on August 24, 1972, brought the phrase to international attention. The event itself was created by Willie Ray Davison on August 1, 1986, with the intent to celebrate rock and roll music and 1950s American culture, to increase tourism during the month, and to raise money for charities. It was first held at the Reno-Sparks Convention Center with a live event featuring the Righteous Brothers, Wolfman Jack, and Jan & Dean. Over the years, Hot August Nights has brought many notable hot rod and custom car designers and builders, such as George Barris, Ed "Big Daddy" Roth and Tom Daniel, to the event, as well as the cast of American Graffiti. The week-long event caps off with a parade of registered cars down Virginia Street on Sunday morning, the last day of the celebration.
The event has since branched out and is now held in several locations, including Virginia City, Victorian Square in Sparks, the Peppermill Hotel and Casino, Atlantis Casino Resort Spa, Grand Sierra Resort, and Virginia Street in downtown Reno.
On July 5, 2010, the Los Angeles Business Journal reported that Hot August Nights would start an event in Long Beach, California, to be held the week before the popular Reno, Nevada event, in the summer of 2011. Further, the article claimed that the event in Reno would cease to exist as of summer 2012, with the Long Beach event taking its place. This caused significant uproar in the Reno community. Hot August Nights officials confirmed that they planned to host events in Long Beach and to open offices there as well. Officials, however, claimed at the time that there was no truth to the rumors that they wished to discontinue the popular Reno event.
The plans for a larger Long Beach, California event never came to fruition.
The event was cancelled in 2020 due to the COVID-19 pandemic.
References
External links
Los Angeles Business Journal Rock and classic cars will hit Long Beach streets as the city has lured away Hot August Nights from a Nevada city.
Tahoe Daily Tribune Hot August Nights refutes reports of move to Long Beach Tahoe Daily Tribune July 6, 2010
Annual events in Nevada
Culture of Reno, Nevada
|
```go
package import_server
import (
"context"
"github.com/werf/werf/v2/pkg/config"
)
type ImportServer interface {
GetCopyCommand(ctx context.Context, importConfig *config.Import) string
}
```
|
The Mahadev Temple in Deobaloda, in the Indian state of Chhattisgarh, is dedicated to Lord Shiva. The temple belongs to the Kalchuri period and is a protected monument under the Archaeological Survey of India. It witnesses high footfall during Mahashivratri, when devotees from nearby villages gather here for Lord Shiva's blessing. The occasion is also accompanied by a small fair.
Overview
It is an ancient temple built by the Kalchuris during the 13th century AD. It is said that the temple was built in just six months, which is why it is also called the 6 Maashi (Maasi or Masi; in English, "months") temple. The temple has a kund (holy pond) and is believed to be connected to Arang, another old town of Chhattisgarh, through a tunnel.
Architecture
The temple faces east and is built of sandstone. It has a garbhagriha and a pillared navaranga mandapa (hall). The shikhara, thought to have been built in the Nagara style, is missing. The garbhagriha houses a Shiva linga about 1.5 feet in height, approached through a highly ornate door entrance guarded by Shaiva dwarpalas. Inside the garbhagriha one can find idols of Goddess Parvati, Ganesha and Hanuman, among others. The mandapa pillars are adorned with images of Bhairava, Vishnu, Mahishasura Mardini (a form of Devi Durga who killed the demon Mahishasura), Shiva, musicians, dancers and kirtimukha designs. The exterior of the temple near the entrance is adorned with a decorated band of gaja, asva and nara. The temple wall has two decorated segments adorned with images of Tripurantaka Shiva, Gajantaka Shiva, Narasimha, Radha Krishna, Ganesha, Varaha and Lakshmi, along with other depictions of gods and goddesses. Pictorial representations of hunting, hunters and bull fighting appear on the temple walls.
A Nandi is placed in front of the temple, as if guarding it.
The temple courtyard has a storehouse-like shed displaying antique idols and statues found during excavation, which may have belonged to the temple.
Legend
It is believed that the sculptor building the temple became so engrossed in his work that he paid no attention to his clothes, working naked day and night to complete the temple.
His wife used to bring food for him, but one day his sister came instead. Ashamed at the sight of each other, the sculptor jumped from the rooftop into the kund (the holy pond inside the temple complex) to hide himself, and his sister jumped into the nearby pond. Both the pond and the kund exist to this day. The pond is called Kasara Talab because the sister was believed to be carrying a kalasha for water.
A kalasha-shaped stone is still present there.
The locals believe there is a secret tunnel inside the kund that leads to a temple in Arang. When the sculptor jumped in, he found the tunnel and reached Arang, where he turned to stone. The Bhanadeva temple is built at that place.
The kund has 23 steps and two wells beside it; in one, the flow of water never stops.
Location
The Mahadev Temple stands in the small town of Deobaloda in Bhilai Charoda. It is well connected by train and road.
By road: The temple is well connected by national highway, located 20 km from the capital city of Raipur and about 15 km from Bhilai Nagar.
By train: Deobaloda Charoda railway station, located just near the temple, serves as a stop for local and passenger trains.
By air: The nearest airport is Swami Vivekananda Airport, Raipur.
Gallery
See also
List of Hindu temples in India
List of Shiva temples in India
References
External links
Shiva temples in Chhattisgarh
Hindu temples in Durg district
Shiva temples
13th-century Hindu temples
Buildings and structures completed in the 13th century
|
```scala
/*
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
*
* path_to_url
*
* Unless required by applicable law or agreed to in writing, software
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
*/
package streaming.core.compositor.spark.udf
import org.apache.spark.SparkException
import org.apache.spark.ml.linalg.{SparseVector, Vector, Vectors}
import scala.collection.mutable.ArrayBuilder
/**
* Created by allwefantasy on 22/1/2018.
*/
object FVectors {
def assemble(vv: Any*): Vector = {
val indices = ArrayBuilder.make[Int]
val values = ArrayBuilder.make[Double]
var cur = 0
vv.foreach {
case v: Double =>
if (v != 0.0) {
indices += cur
values += v
}
cur += 1
case vec: Vector =>
vec.foreachActive { case (i, v) =>
if (v != 0.0) {
indices += cur + i
values += v
}
}
cur += vec.size
case null =>
// TODO: output Double.NaN?
throw new SparkException("Values to assemble cannot be null.")
case o =>
throw new SparkException(s"$o of type ${o.getClass.getName} is not supported.")
}
Vectors.sparse(cur, indices.result(), values.result()).compressed
}
def slice(vector: SparseVector, selectedIndices: Array[Int]): SparseVector = {
val localIndices = vector.indices
val localValues = vector.values
var currentIdx = 0
val (sliceInds, sliceVals) = selectedIndices.flatMap { origIdx =>
// binary search the vector's own (sorted) active indices for the requested
// original index; searching selectedIndices here would always succeed and
// pick up the wrong values
val iIdx = java.util.Arrays.binarySearch(localIndices, origIdx)
val i_v = if (iIdx >= 0) {
Iterator((currentIdx, localValues(iIdx)))
} else {
Iterator()
}
currentIdx += 1
i_v
}.unzip
new SparseVector(selectedIndices.length, sliceInds.toArray, sliceVals.toArray)
}
def range(vector: SparseVector, range: Array[Int]): SparseVector = {
val start = range(0)
val end = range(1)
val (sliceVals, sliceInds) = vector.values.zip(vector.indices).filter { d =>
val (v, i) = d
i < end && i >= start
}.unzip
new SparseVector(end - start, sliceInds.toArray, sliceVals.toArray)
}
}
```
|
Šest černých dívek aneb Proč zmizel Zajíc is a 1969 Czechoslovak film. The film starred Josef Kemr.
References
External links
1969 films
Czechoslovak crime comedy films
1960s Czech-language films
Czech crime comedy films
1960s Czech films
|
"Please Don't Break My Heart" is a song by Greek-American pop singer Kalomira, featuring American rapper Fatman Scoop. It serves as the first single from her upcoming studio album and was released as a digital download on 1 May 2010. The song was produced by Toni Cottura.
Promotion
Kalomira premiered the song on the Greek version of Dancing with the Stars. She also performed the song together with Fatman Scoop at the 2010 MAD Video Music Awards.
Music video
The music video was shot in April 2010 in Istanbul, Turkey. It premiered on May 20, 2010 via Kalomira's official YouTube account. In the video, Kalomira sings to a photo of her boyfriend and walks around in a fairy-tale setting in a dress before transitioning into an urban setting in hip-hop clothing with scenes of Fatman Scoop rapping alongside her.
Charts
References
2010 singles
Kalomira songs
Fatman Scoop songs
Songs written by Terri Bjerre
2010 songs
Songs written by Toni Cottura
|
```go
/*
* path_to_url
* All Rights Reserved.
*
*
* path_to_url
*
* Unless required by applicable law or agreed to in writing, software
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
*/
package functions
import (
"testing"
"github.com/stretchr/testify/assert"
)
func TestLtrim(t *testing.T) {
call := Ltrim{}
res, _ := call.Call([]interface{}{"hello world", "hello "})
assert.Equal(t, "world", res)
}
```
|
```go
// Unless explicitly stated otherwise all files in this repository are licensed
// This product includes software developed at Datadog (path_to_url
package info
import (
"testing"
)
func TestPublishTraceWriterInfo(t *testing.T) {
traceWriterInfo = TraceWriterInfo{
// do not use field names here, to ensure we cover all fields
atom(1),
atom(2),
atom(3),
atom(4),
atom(5),
atom(6),
atom(7),
atom(8),
atom(9),
}
testExpvarPublish(t, publishTraceWriterInfo,
map[string]interface{}{
// all JSON numbers are floats, so the results come back as floats
"Payloads": 1.0,
"Traces": 2.0,
"Events": 3.0,
"Spans": 4.0,
"Errors": 5.0,
"Retries": 6.0,
"Bytes": 7.0,
"BytesUncompressed": 8.0,
"SingleMaxSize": 9.0,
})
}
func TestPublishStatsWriterInfo(t *testing.T) {
statsWriterInfo = StatsWriterInfo{
// do not use field names here, to ensure we cover all fields
atom(1),
atom(2),
atom(3),
atom(4),
atom(5),
atom(6),
atom(7),
atom(8),
}
testExpvarPublish(t, publishStatsWriterInfo,
map[string]interface{}{
// all JSON numbers are floats, so the results come back as floats
"Payloads": 1.0,
"ClientPayloads": 2.0,
"StatsBuckets": 3.0,
"StatsEntries": 4.0,
"Errors": 5.0,
"Retries": 6.0,
"Splits": 7.0,
"Bytes": 8.0,
})
}
func TestPublishRateByService(t *testing.T) {
rateByService = map[string]float64{"foo": 123.0}
testExpvarPublish(t, publishRateByService,
map[string]interface{}{
"foo": 123.0,
})
}
```
|
The Town of Wheaton is located in Chippewa County in the U.S. state of Wisconsin. The population was 2,701 at the 2010 census, up from 2,366 at the 2000 census. The unincorporated communities of Old Albertville and Pine Grove are located in the town.
Geography
The town of Wheaton is in the southwest corner of Chippewa County, bordered by Dunn County to the west and Eau Claire County to the south. The Chippewa River forms the southeast border of the town, across which is the city of Chippewa Falls in the north, the village of Lake Hallie in the center, and the city of Eau Claire in the south. Chippewa Falls also has a land border with Wheaton in the northeast, and Eau Claire has a land border to the south of Wheaton. According to the United States Census Bureau, the town has a total area of , of which is land and , or 0.86%, is water.
History
The squares that became the Town of Wheaton were first surveyed in the fall of 1848 by a crew working for the U.S. government. In August and September 1849 another crew marked all the section corners, walking the woods and swamps on foot, measuring with chain and compass. When done, the deputy surveyor filed this general description of the six mile square that contains the east end of Wheaton, along the river:
The land in this Township East of Chippewa Rive is very Sandy producing only Yellow & Pitch(?) Pine Timber except near the Swamps where White Pine, Birch & Elm is found. West of the River in the N.W. corner of the Township are several Sections of Aspen Thickets having a soil of sand & loam 2d rate land. but they are nearly destitute of water. Chippewa Rive is navigable for Keal Boats at all seasons of the year but the(?) rapids in Sections 15 & 22 and in Section 32 render the navigation difficult.
Demographics
As of the census of 2000, there were 2,366 people, 852 households, and 692 families residing in the town. The population density was 43.2 people per square mile (16.7/km2). There were 874 housing units at an average density of 15.9 per square mile (6.2/km2). The racial makeup of the town was 98.27% White, 0.08% African American, 0.38% Native American, 0.08% Asian, 0.17% Pacific Islander, and 1.01% from two or more races. Hispanic or Latino of any race were 0.25% of the population.
There were 852 households, out of which 36.7% had children under the age of 18 living with them, 72.3% were married couples living together, 5.4% had a female householder with no husband present, and 18.7% were non-families. 13.3% of all households were made up of individuals, and 5.2% had someone living alone who was 65 years of age or older. The average household size was 2.77 and the average family size was 3.04.
The population distribution was 26.1% under the age of 18, 6.9% from 18 to 24, 30.2% from 25 to 44, 27.8% from 45 to 64, and 9.0% who were 65 years of age or older. The median age was 38 years. For every 100 females, there were 104.5 males. For every 100 females age 18 and over, there were 103.8 males.
The median income for a household in the town was $52,692, and the median income for a family was $55,061. Males had a median income of $34,549 versus $26,176 for females. The per capita income for the town was $20,023. About 2.0% of families and 3.5% of the population were below the poverty line, including 1.5% of those under age 18 and 6.3% of those age 65 or over.
Culture
The fictional character of Recondo (G.I. Joe), part of G.I. Joe: A Real American Hero, was said to be from Wheaton.
References
External links
Town of Wheaton official website
Towns in Chippewa County, Wisconsin
Towns in Wisconsin
|
Carancahua Bay is a northern extension of Matagorda Bay located in Jackson and Matagorda counties in Texas, United States. It is oriented from the southeast to the northwest but meanders as it reaches the north to the confluence with Carancahua Creek. Generally slender, it is only about in width north of its circular mouth.
The bay serves as a nursery for shrimp and as an ecosystem for diverse species of birds and fish. Shrimp farms have been established inland to circumvent restrictions on the bay. The area close to shore is prone to flooding, and can sometimes accumulate large populations of mosquitos. As a consequence, no major settlements have been founded on the bay. However, the small communities of Port Alto and Carancahua have been established on the western and eastern shores, respectively.
History
The name Carancahua derives from the term that formerly referred to the Karankawa Indians, who resided on its shores.
Texas' Spanish Royal Governor, Martín de Alarcón was the first documented European to tour the bay while exploring Matagorda Bay with Tejas guides in 1718. During the expedition, two Karankawa Indians were spotted near the bay going about their daily lives and were frightened at the sight of Alarcón and his men. They quickly swam across the bay despite the guides' signal to them that Alarcón meant no harm. The next day, the Indians came ashore from a sixteen-passenger canoe (which could hold 4 men, 4 women and 8 children) and notified the Tejas guide that they wished for Alarcón and his men to leave. As a peace offering, Alarcón presented the Indians with tobacco and clothing on behalf of the Spanish crown, which they accepted. In exchange, the Indians offered Alarcón dried fish, and directed him toward the former French fort of St. Louis, believing Alarcón wanted to establish a colony on the bay. Alarcón declared the bay for Spain, but did not establish a permanent settlement.
Only a handful of settlements have been established on the bay. The town of Carancahua first formed as a small collection of cabins that were used in the 1880s as a stop for mail between Texana and Matagorda. However, the bay's propensity for flooding and malaria prevented growth. In fact, the bay was notorious for its swarms of mosquitos that would fly from the Colorado River delta, and documented by a late 19th-century rancher:
A fairly strong easterly wind had been blowing for three days; on the evening of the third day, the mosquitos arrived, flying high, about fifty feet, and looking like a cloud of mist over Carancahua Bay. At the ranch, they set everything on fire that had blood in it, and all work was suspended by unanimous consent...little or nothing was done for nearly five days; by this time the main body had passed, though plenty remained to make everything uncomfortable for about two weeks. This migration was from east to west and the line was about three miles wide.
Approximately 50 people lived at Carancahua in 1915, but the population dwindled to 25 in the next decade. The town remains a community, but the current population is unknown. Across the bay, a settlement of about ten permanent residents, initially known as Persimmon Point, was renamed Port Alto in 1939. The town grew, attracting retirees and vacationers who contributed to a peak summer population of 205 in 1961. Hurricane Carla destroyed the town in September 1961, but it was rebuilt five years later. In 1970, a beachfront was constructed along the shoreline as the listed population reached 170 people. The 2000 census reported that 45 people lived in the town. The original Schicke Point was home to a small ranching and farm operation. The name derives from an original resident of the point (Clarence Schicke) who came from Illinois as a game hunter and fisherman for local restaurants. After Hurricane Carla destroyed the ranch home and operations, C. Schicke turned to commercial fishing and building small cabins in the area. Schicke Point is located near the mouth of Carancahua Bay at Carancahua Pass and is a location attractive to sports fishermen. The village has approximately 90 residents, including legendary anglers Roy Cross (1924-2009), Otto Mendel (1913-2008), and fishing spoon expert Robert Cross. Cape Carancahua is a gated residential community, located on the bay's northern shore.
Features
The bay has two extensions near its mouth with Matagorda Bay at Carancahua Pass, including Redfish Lake to the southwest and Salt Lake, just above the former. On the bay's eastern shore, the mouth is headed by Schicke Point, which curves north to the Schicke Point Community, where several private piers are located. About one mile (1.6 km) inland from the community's shoreline are the Piper Lakes. North from the Schicke Point community, the El Campo Club community is found, with several residences on a straight line along the coast with docks stretched into the water. Further north, the bay takes a sharp turn to the west past a swampy area then heads north and becomes more slender as it passes from Calhoun into Jackson County. The shoreline continues north and passes several oil wells to the town of Carancahua, where a few piers are scattered along the shore. Just north of the town, a small inlet is formed, at the base of which, the Fivemile Draw is found, surrounded by swamps. To the north, several docks line the shore and continue until the bay winds to the west to a large swamp. Past the swamp, the width of the bay shrinks and continues southward along the shore of the Cape Carancahua community, surrounded by water on three sides. Past the cape, the bay turns to the north and is crossed by Texas State Highway 35. It then heads west and north again, while gradually becoming narrower until it reaches the marsh at the mouth of Carancahua Creek. The East and West Carancahua Creeks, which merge before their confluence, feed the bay. West Carancahua Creek runs south from its source near White Hall to meet with East Carancahua Creek, which runs southwest for from its source in southern Wharton County. Both streams are intermittent in their upper reaches. The western shore mimics the shape of the east. 
As it moves south of the Carancahua Creek mouth, Weedhaven is formed, south of which, the shore counters Cape Carancahua and heads northeast past several oil wells to a sharp point. The shoreline continues directly south until it reaches a large swamp. Past the swamp is the town of Port Alto, where several docks and piers are located.
Ecosystem
Carancahua Bay is protected by the State of Texas and locally by the 300-member Carancahua Bay Protection Association. It is a nursery bay for shrimp, and is a habitat for shellfish including oysters. Finfish such as the redfish and black drum are commonly caught from the bay by recreational fishermen. Birds common to the bay include the wood ibis, roseate spoonbill, snowy egret, great-tailed grackle, Louisiana heron, willet, black-necked stilt, crested caracara and the black vulture.
During a 2004 assessment of Texas waterways, the Texas Commission on Environmental Quality found higher than normal levels of bacteria at the mouth of Carancahua Creek and alkaline pH levels, symptomatic of algal bloom. The issues discovered by the Commission were common in the water bodies examined for the study, and they noted that such issues would be addressed.
Industry
The bay is off limits to shrimping due to its legal status as a nursery, however shrimp farms have been established and approved along its shores. The harvest of shellfish, particularly oysters, is heavily regulated, but allowed at certain times and places. Whereas the main Matagorda Bay is an approved area for shellfish production, Carancahua Bay is divided between restricted and conditionally approved areas. The Texas Department of State Health Services described the areas conditionally approved as being from the mouth of the bay to a "beige house" on the eastern shore and cutoff across to a "grey barn" on the western shore, save for a small sliver of water that includes most of the shoreline of Port Alto, which is restricted. All areas north of the diagonal line are restricted as well.
Several oil and natural gas wells are scattered throughout the shoreline and a few are included in the waters of Carancahua Bay. The most notable include the wells of the Appling Field segment, a mile offshore from Port Alto, which is believed to contain 33 billion cubic feet (0.93 billion cubic meters) of natural gas. The field was first discovered in the 1950s, but later abandoned. Brigham Exploration is working with Royale Energy to develop ten reserves in the area, spotted during a seismic survey.
References
Bays of Texas
Bodies of water of Matagorda County, Texas
Bodies of water of Jackson County, Texas
|
```c++
// (See accompanying file LICENSE.md or copy at path_to_url
#define BOOST_HANA_TEST_FOLDABLE_ITERABLE_MCD
#define BOOST_HANA_TEST_SEARCHABLE
#include <laws/templates/seq.hpp>
```
|
An erg (also sand sea or dune sea, or sand sheet if it lacks dunes) is a broad, flat area of desert covered with wind-swept sand with little or no vegetative cover. The word is derived from the Arabic word ʿarq (), meaning "dune field". Strictly speaking, an erg is defined as a desert area that contains more than of aeolian or wind-blown sand and where sand covers more than 20% of the surface. Smaller areas are known as "dune fields". The largest hot desert in the world, the Sahara, covers and contains several ergs, such as the Chech Erg and the Issaouane Erg in Algeria. Approximately 85% of all the Earth's mobile sand is found in ergs that are greater than . Ergs are also found on other celestial bodies, such as Venus, Mars, and Saturn's moon Titan.
Geography
Sand seas and dune fields generally occur in regions downwind of copious sources of dry, loose sand, such as dry riverbeds and deltas, floodplains, glacial outwash plains, dry lakes, and beaches. Ergs are concentrated in two broad belts between 20° to 40°N and 20° to 40°S latitudes, which include regions crossed by the dry, subsiding air of the trade winds. Active ergs are limited to regions that receive, on average, no more than 150 mm of annual precipitation. The largest are in northern and southern Africa, central and western Asia, and Central Australia.
In South America, ergs are limited by the Andes Mountains, but they do contain extremely large dunes in coastal Peru and northwestern Argentina. They are also found in several parts of the northeast coast of Brazil. The only active erg in North America is in the Gran Desierto de Altar that extends from the Sonoran Desert in the northwestern Mexican state of Sonora to the Yuma Desert of Arizona and the Algodones Dunes of southeastern California. An erg that has been fixed by vegetation forms the Nebraska Sandhills.
Description
Almost all major ergs are located downwind from river beds in areas that are too dry to support extensive vegetative cover and are thus subject to long-continued wind erosion. Sand from these abundant sources migrates downwind and builds up into very large dunes where its movement is halted or slowed by topographic barriers to windflow or by convergence of windflow.
Entire ergs and dune fields tend to migrate downwind as far as hundreds of kilometers from their sources of sand. Such accumulation requires long periods of time. At least one million years is required to build ergs with very large dunes, such as those on the Arabian Peninsula, in North Africa, and in central Asia. Sand seas that have accumulated in subsiding structural and topographic basins, such as the Murzuk Sand Sea of Libya, may attain great thicknesses (more than 1000 m) but others, such as the ergs of linear dunes in the Simpson Desert and Great Sandy Desert of Australia, may be no thicker than the individual dunes superposed on the alluvial plain. Within sand seas in a given area, the dunes tend to be of a single type. For example, there are ergs or fields of linear dunes, of crescentic dunes, of star dunes, and of parabolic dunes, and these dune arrays tend to have consistent orientations and sizes.
By nature, ergs are very active. Smaller dunes form and migrate along the flanks of the larger dunes and sand ridges. Occasional precipitation fills basins formed by the dunes; as the water evaporates, salt deposits are left behind.
Individual dunes in ergs typically have widths, lengths, or both dimensions greater than . Both the regional extent of their sand cover and the complexity and great size of their dunes distinguish ergs from dune fields. The depth of sand in ergs varies widely around the world, ranging from only a few centimeters deep in the Selima Sand Sheet of Southern Egypt, to approximately in the Simpson Desert, and in the Sahara. This is far shallower than ergs in prehistoric times were. Evidence in the geological record indicates that some Mesozoic and Paleozoic ergs reached a mean depth of several hundred meters.
Extraterrestrial ergs
Ergs are a geological feature that can be found on planets where an atmosphere capable of significant wind erosion acts on the surface for a significant period of time, creating sand and allowing it to accumulate.
Today at least three bodies in the Solar System, apart from Earth, are known to feature ergs on their surface: Venus, Mars and Titan.
Venus
At least two ergs have been recognized by the Magellan probe on Venus: the Aglaonice dune field, which covers approximately , and the Meshkenet dune field (~). These seem to be mostly transverse dune fields (with dune crests perpendicular to prevailing winds).
Mars
Mars shows very large ergs, especially next to the polar caps, where dunes can reach a considerable size. Ergs on Mars can exhibit strange shapes and patterns, due to complex interaction with the underlying surface and wind direction.
Titan
Radar images captured by the Cassini spacecraft as it flew by Titan in October 2005 show sand dunes at Titan's equator much like those in deserts of Earth. One erg was observed to be more than long. Dunes are a dominant landform on Titan. Approximately 15-20% of the surface is covered by ergs with an estimated total area of 12–18 million km2 making it the largest dune field coverage in the Solar System identified to date.
The sand dunes are believed to be formed by wind generated as a result of tidal forces from Saturn on Titan's atmosphere. The images are evidence that these dunes were built from winds that blow in one direction before switching to another and then back to the first direction and so on, causing the sand dunes to build up in long parallel lines. These tidal winds combined with Titan's west-to-east zonal winds create dunes aligned west-to-east nearly everywhere except close to mountains, which alter wind direction.
The sand on Titan might have formed when liquid methane rained and eroded the ice bedrock, possibly in the form of flash floods. Alternatively, the sand could also have come from organic solids produced by photochemical reactions in Titan's atmosphere.
See also
References
Dunes
Aeolian landforms
Erosion landforms
|
```swift
@testable import SwifterSwift
import XCTest
#if canImport(UIKit) && !os(watchOS)
import UIKit
final class UINavigationBarExtensionsTests: XCTestCase {
    func testSetTitleFont() {
        let navigationBar = UINavigationBar()
        let helveticaFont = UIFont(name: "HelveticaNeue", size: 14)!
        navigationBar.setTitleFont(helveticaFont, color: .green)

        if #available(iOS 13.0, tvOS 13.0, *) {
            let color = navigationBar.standardAppearance
                .titleTextAttributes[NSAttributedString.Key.foregroundColor] as? UIColor
            XCTAssertEqual(color, .green)

            let font = navigationBar.standardAppearance.titleTextAttributes[NSAttributedString.Key.font] as? UIFont
            XCTAssertEqual(font, helveticaFont)

            navigationBar.setTitleFont(helveticaFont)
            let defaultColor = navigationBar.standardAppearance
                .titleTextAttributes[NSAttributedString.Key.foregroundColor] as? UIColor
            XCTAssertEqual(defaultColor, .black)
        } else {
            let color = navigationBar.titleTextAttributes?[NSAttributedString.Key.foregroundColor] as? UIColor
            XCTAssertEqual(color, .green)

            let font = navigationBar.titleTextAttributes?[NSAttributedString.Key.font] as? UIFont
            XCTAssertEqual(font, helveticaFont)

            navigationBar.setTitleFont(helveticaFont)
            let defaultColor = navigationBar.titleTextAttributes?[NSAttributedString.Key.foregroundColor] as? UIColor
            XCTAssertEqual(defaultColor, .black)
        }
    }

    func testMakeTransparent() {
        let navigationBar = UINavigationBar()
        navigationBar.makeTransparent(withTint: .red)

        let legacyTests = {
            XCTAssertNotNil(navigationBar.backgroundImage(for: .default))
            XCTAssertNotNil(navigationBar.shadowImage)
            XCTAssert(navigationBar.isTranslucent)
            XCTAssertEqual(navigationBar.tintColor, .red)

            let color = navigationBar.titleTextAttributes?[NSAttributedString.Key.foregroundColor] as? UIColor
            XCTAssertEqual(color, .red)

            navigationBar.makeTransparent()
            let defaultColor = navigationBar.titleTextAttributes?[NSAttributedString.Key.foregroundColor] as? UIColor
            XCTAssertEqual(defaultColor, .white)
        }

        #if os(tvOS)
        legacyTests()
        #else
        if #available(iOS 13.0, *) {
            XCTAssertEqual(navigationBar.tintColor, .red)

            let standardAppearanceColor = navigationBar.standardAppearance
                .titleTextAttributes[NSAttributedString.Key.foregroundColor] as? UIColor
            let scrollEdgeAppearanceColor = navigationBar.scrollEdgeAppearance?
                .titleTextAttributes[NSAttributedString.Key.foregroundColor] as? UIColor
            let compactAppearanceColor = navigationBar.compactAppearance?
                .titleTextAttributes[NSAttributedString.Key.foregroundColor] as? UIColor
            XCTAssertEqual(standardAppearanceColor, .red)
            XCTAssertEqual(scrollEdgeAppearanceColor, .red)
            XCTAssertEqual(compactAppearanceColor, .red)

            navigationBar.makeTransparent()
            let standardAppearanceDefaultColor = navigationBar.standardAppearance
                .titleTextAttributes[NSAttributedString.Key.foregroundColor] as? UIColor
            let scrollEdgeAppearanceDefaultColor = navigationBar.scrollEdgeAppearance?
                .titleTextAttributes[NSAttributedString.Key.foregroundColor] as? UIColor
            let compactAppearanceDefaultColor = navigationBar.compactAppearance?
                .titleTextAttributes[NSAttributedString.Key.foregroundColor] as? UIColor
            XCTAssertEqual(standardAppearanceDefaultColor, .white)
            XCTAssertEqual(scrollEdgeAppearanceDefaultColor, .white)
            XCTAssertEqual(compactAppearanceDefaultColor, .white)
        } else {
            legacyTests()
        }
        #endif
    }

    func testSetColors() {
        let navigationBar = UINavigationBar()
        navigationBar.setColors(background: .blue, text: .green)

        let legacyTests = {
            XCTAssertFalse(navigationBar.isTranslucent)
            XCTAssertEqual(navigationBar.backgroundColor, .blue)
            XCTAssertEqual(navigationBar.barTintColor, .blue)
            XCTAssertNotNil(navigationBar.backgroundImage(for: .default))
            XCTAssertEqual(navigationBar.tintColor, .green)

            let color = navigationBar.titleTextAttributes?[NSAttributedString.Key.foregroundColor] as? UIColor
            XCTAssertEqual(color, .green)
        }

        #if os(tvOS)
        legacyTests()
        #else
        if #available(iOS 13.0, *) {
            XCTAssertEqual(navigationBar.tintColor, .green)

            let standardAppearanceBackgroundColor = navigationBar.standardAppearance.backgroundColor
            let scrollEdgeAppearanceBackgroundColor = navigationBar.scrollEdgeAppearance?.backgroundColor
            let compactAppearanceBackgroundColor = navigationBar.compactAppearance?.backgroundColor
            XCTAssertEqual(standardAppearanceBackgroundColor, .blue)
            XCTAssertEqual(scrollEdgeAppearanceBackgroundColor, .blue)
            XCTAssertEqual(compactAppearanceBackgroundColor, .blue)

            let standardAppearanceTextColor = navigationBar.standardAppearance
                .titleTextAttributes[NSAttributedString.Key.foregroundColor] as? UIColor
            let scrollEdgeAppearanceTextColor = navigationBar.scrollEdgeAppearance?
                .titleTextAttributes[NSAttributedString.Key.foregroundColor] as? UIColor
            let compactAppearanceTextColor = navigationBar.compactAppearance?
                .titleTextAttributes[NSAttributedString.Key.foregroundColor] as? UIColor
            XCTAssertEqual(standardAppearanceTextColor, .green)
            XCTAssertEqual(scrollEdgeAppearanceTextColor, .green)
            XCTAssertEqual(compactAppearanceTextColor, .green)
        } else {
            legacyTests()
        }
        #endif
    }
}
#endif
```
|
Sithembile Xola Pearl Thusi (born 13 May 1988) is a South African actress, model, and presenter. She is known for her roles as Patricia Kopong in the BBC/HBO comedy-drama series The No. 1 Ladies' Detective Agency, Dayana Mampasi in the ABC thriller Quantico and Samkelo in the romance film Catching Feelings. In 2020, she starred in the title role of Netflix's first African original series, Queen Sono.
Early life and education
Thusi is from the townships of KwaNdengezi and Hammarsdale, just outside Durban. She has two sisters. She attended Pinetown Girls' High School. She began her studies at the University of the Witwatersrand, but withdrew to make time for her career. In 2020, she resumed her studies at the University of South Africa.
Career
Thusi is the host of Lip Sync Battle Africa on MTV and e.tv, as well as the talk show Moments on EbonyLife TV. She has starred as Palesa Motaung on the SABC 3 soap opera Isidingo, and co-hosted both Live Amp, with DJ Warras and Luthando Shosha, and the SABC 1 celebrity gossip magazine show Real Goboza.
In 2009, Thusi starred as Patricia Kopong on the BBC/HBO comedy-drama The No. 1 Ladies' Detective Agency.
In 2015, Thusi co-starred as Dr. Nandi Montabu in Tremors 5: Bloodlines. She also appeared in a music video entitled "Pearl Thusi" by rapper Emtee.
In 2016, Thusi was cast as a series regular in the role of Dayana Mampasi on the second season of the ABC thriller series Quantico, opposite Priyanka Chopra. In the same year, Thusi was cast as Samkelo in the romantic drama film Catching Feelings. The film was released in theaters on 9 March 2018.
In 2017, Thusi starred as Brenda Riviera in the drama film Kalushi.
In 2018, Thusi became the new host of the third season of MTV Base's Behind the Story. In the same year, she was cast in the lead role of Queen Sono on the Netflix crime drama series Queen Sono. The series premiered on 28 February 2020 to wide critical acclaim, with Thusi's performance in particular singled out for praise. In April 2020, Netflix renewed the series for a second season; however, on 26 November 2020, it was reported that Netflix had cancelled the series because of production challenges brought on by the COVID-19 pandemic.
On 15 December 2020, she co-hosted the 1st KZN Entertainment Awards alongside Somizi Mhlongo.
In February 2021, Thusi was cast as Zama Zulu in the Netflix film Fistful of Vengeance. It was released on 17 February 2022.
Filmography
Film
Television
References
External links
TVSA Actor Profile
1988 births
Living people
South African actresses
Actresses from Durban
Actors from KwaZulu-Natal
|
```css
/*!
* Bootstrap-select v1.10.0 (path_to_url
*
*/select.bs-select-hidden,select.selectpicker{display:none!important}.bootstrap-select{width:220px\9}.bootstrap-select>.dropdown-toggle{width:100%;padding-right:25px;z-index:1}.bootstrap-select>select{position:absolute!important;bottom:0;left:50%;display:block!important;width:.5px!important;height:100%!important;padding:0!important;opacity:0!important;border:none}.bootstrap-select>select.mobile-device{top:0;left:0;display:block!important;width:100%!important;z-index:2}.error .bootstrap-select .dropdown-toggle,.has-error .bootstrap-select .dropdown-toggle{border-color:#b94a48}.bootstrap-select.fit-width{width:auto!important}.bootstrap-select:not([class*=col-]):not([class*=form-control]):not(.input-group-btn){width:220px}.bootstrap-select .dropdown-toggle:focus{outline:thin dotted #333!important;outline:5px auto -webkit-focus-ring-color!important;outline-offset:-2px}.bootstrap-select.form-control{margin-bottom:0;padding:0;border:none}.bootstrap-select.form-control:not([class*=col-]){width:100%}.bootstrap-select.form-control.input-group-btn{z-index:auto}.bootstrap-select.btn-group:not(.input-group-btn),.bootstrap-select.btn-group[class*=col-]{float:none;display:inline-block;margin-left:0}.bootstrap-select.btn-group.dropdown-menu-right,.bootstrap-select.btn-group[class*=col-].dropdown-menu-right,.row .bootstrap-select.btn-group[class*=col-].dropdown-menu-right{float:right}.form-group .bootstrap-select.btn-group,.form-horizontal .bootstrap-select.btn-group,.form-inline .bootstrap-select.btn-group{margin-bottom:0}.form-group-lg .bootstrap-select.btn-group.form-control,.form-group-sm .bootstrap-select.btn-group.form-control{padding:0}.form-inline .bootstrap-select.btn-group 
.form-control{width:100%}.bootstrap-select.btn-group.disabled,.bootstrap-select.btn-group>.disabled{cursor:not-allowed}.bootstrap-select.btn-group.disabled:focus,.bootstrap-select.btn-group>.disabled:focus{outline:0!important}.bootstrap-select.btn-group.bs-container{position:absolute}.bootstrap-select.btn-group.bs-container .dropdown-menu{z-index:1060}.bootstrap-select.btn-group .dropdown-toggle .filter-option{display:inline-block;overflow:hidden;width:100%;text-align:left}.bootstrap-select.btn-group .dropdown-toggle .caret{position:absolute;top:50%;right:12px;margin-top:-2px;vertical-align:middle}.bootstrap-select.btn-group[class*=col-] .dropdown-toggle{width:100%}.bootstrap-select.btn-group .dropdown-menu{min-width:100%;-webkit-box-sizing:border-box;-moz-box-sizing:border-box;box-sizing:border-box}.bootstrap-select.btn-group .dropdown-menu.inner{position:static;float:none;border:0;padding:0;margin:0;border-radius:0;-webkit-box-shadow:none;box-shadow:none}.bootstrap-select.btn-group .dropdown-menu li{position:relative}.bootstrap-select.btn-group .dropdown-menu li.active small{color:#fff}.bootstrap-select.btn-group .dropdown-menu li.disabled a{cursor:not-allowed}.bootstrap-select.btn-group .dropdown-menu li a{cursor:pointer;-webkit-user-select:none;-moz-user-select:none;-ms-user-select:none;user-select:none}.bootstrap-select.btn-group .dropdown-menu li a.opt{position:relative;padding-left:2.25em}.bootstrap-select.btn-group .dropdown-menu li a span.check-mark{display:none}.bootstrap-select.btn-group .dropdown-menu li a span.text{display:inline-block}.bootstrap-select.btn-group .dropdown-menu li small{padding-left:.5em}.bootstrap-select.btn-group .dropdown-menu .notify{position:absolute;bottom:5px;width:96%;margin:0 2%;min-height:26px;padding:3px 5px;background:#f5f5f5;border:1px solid #e3e3e3;-webkit-box-shadow:inset 0 1px 1px rgba(0,0,0,.05);box-shadow:inset 0 1px 1px 
rgba(0,0,0,.05);pointer-events:none;opacity:.9;-webkit-box-sizing:border-box;-moz-box-sizing:border-box;box-sizing:border-box}.bootstrap-select.btn-group .no-results{padding:3px;background:#f5f5f5;margin:0 5px;white-space:nowrap}.bootstrap-select.btn-group.fit-width .dropdown-toggle .filter-option{position:static}.bootstrap-select.btn-group.fit-width .dropdown-toggle .caret{position:static;top:auto;margin-top:-1px}.bootstrap-select.btn-group.show-tick .dropdown-menu li.selected a span.check-mark{position:absolute;display:inline-block;right:15px;margin-top:5px}.bootstrap-select.btn-group.show-tick .dropdown-menu li a span.text{margin-right:34px}.bootstrap-select.show-menu-arrow.open>.dropdown-toggle{z-index:1061}.bootstrap-select.show-menu-arrow .dropdown-toggle:before{content:'';border-left:7px solid transparent;border-right:7px solid transparent;border-bottom:7px solid rgba(204,204,204,.2);position:absolute;bottom:-4px;left:9px;display:none}.bootstrap-select.show-menu-arrow .dropdown-toggle:after{content:'';border-left:6px solid transparent;border-right:6px solid transparent;border-bottom:6px solid #fff;position:absolute;bottom:-4px;left:10px;display:none}.bootstrap-select.show-menu-arrow.dropup .dropdown-toggle:before{bottom:auto;top:-3px;border-top:7px solid rgba(204,204,204,.2);border-bottom:0}.bootstrap-select.show-menu-arrow.dropup .dropdown-toggle:after{bottom:auto;top:-3px;border-top:6px solid #fff;border-bottom:0}.bootstrap-select.show-menu-arrow.pull-right .dropdown-toggle:before{right:12px;left:auto}.bootstrap-select.show-menu-arrow.pull-right .dropdown-toggle:after{right:13px;left:auto}.bootstrap-select.show-menu-arrow.open>.dropdown-toggle:after,.bootstrap-select.show-menu-arrow.open>.dropdown-toggle:before{display:block}.bs-actionsbox,.bs-donebutton,.bs-searchbox{padding:4px 8px}.bs-actionsbox{width:100%;-webkit-box-sizing:border-box;-moz-box-sizing:border-box;box-sizing:border-box}.bs-actionsbox .btn-group 
button{width:50%}.bs-donebutton{float:left;width:100%;-webkit-box-sizing:border-box;-moz-box-sizing:border-box;box-sizing:border-box}.bs-donebutton .btn-group button{width:100%}.bs-searchbox+.bs-actionsbox{padding:0 8px 4px}.bs-searchbox .form-control{margin-bottom:0;width:100%;float:none}
```
|
```c++
/*
* All rights reserved.
*
* Redistribution and use in source and binary forms, with or without
* modification, are permitted provided that the following conditions are met:
* 1. Redistributions of source code must retain the above copyright
* notice, this list of conditions and the following disclaimer.
* 2. Redistributions in binary form must reproduce the above copyright
* notice, this list of conditions and the following disclaimer in the
* documentation and/or other materials provided with the distribution.
* 3. Neither the name of the copyright holder nor the
* names of its contributors may be used to endorse or promote products
* derived from this software without specific prior written permission.
*
* THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
* AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
* IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE
* LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
* CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
* SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
* INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
* CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
* ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
* POSSIBILITY OF SUCH DAMAGE.
*/
/**
* @file
* This file includes definitions for Thread Link Metrics.
*/
#include "link_metrics.hpp"
#if OPENTHREAD_CONFIG_MLE_LINK_METRICS_INITIATOR_ENABLE || OPENTHREAD_CONFIG_MLE_LINK_METRICS_SUBJECT_ENABLE
#include "common/code_utils.hpp"
#include "common/encoding.hpp"
#include "common/locator_getters.hpp"
#include "common/log.hpp"
#include "common/num_utils.hpp"
#include "common/numeric_limits.hpp"
#include "instance/instance.hpp"
#include "mac/mac.hpp"
#include "thread/link_metrics_tlvs.hpp"
#include "thread/neighbor_table.hpp"
namespace ot {
namespace LinkMetrics {
RegisterLogModule("LinkMetrics");
static constexpr uint8_t kQueryIdSingleProbe = 0; // This query ID represents Single Probe.
static constexpr uint8_t kSeriesIdAllSeries = 255; // This series ID represents all series.
// Constants for scaling Link Margin and RSSI to raw value
static constexpr uint8_t kMaxLinkMargin = 130;
static constexpr int32_t kMinRssi = -130;
static constexpr int32_t kMaxRssi = 0;
#if OPENTHREAD_CONFIG_MLE_LINK_METRICS_INITIATOR_ENABLE
Initiator::Initiator(Instance &aInstance)
    : InstanceLocator(aInstance)
{
}

Error Initiator::Query(const Ip6::Address &aDestination, uint8_t aSeriesId, const Metrics *aMetrics)
{
    Error     error;
    Neighbor *neighbor;
    QueryInfo info;

    SuccessOrExit(error = FindNeighbor(aDestination, neighbor));

    info.Clear();
    info.mSeriesId = aSeriesId;

    if (aMetrics != nullptr)
    {
        info.mTypeIdCount = aMetrics->ConvertToTypeIds(info.mTypeIds);
    }

    if (aSeriesId != 0)
    {
        VerifyOrExit(info.mTypeIdCount == 0, error = kErrorInvalidArgs);
    }

    error = Get<Mle::Mle>().SendDataRequestForLinkMetricsReport(aDestination, info);

exit:
    return error;
}
Error Initiator::AppendLinkMetricsQueryTlv(Message &aMessage, const QueryInfo &aInfo)
{
    Error error = kErrorNone;
    Tlv   tlv;

    // The MLE Link Metrics Query TLV has two sub-TLVs:
    // - Query ID sub-TLV with series ID as value.
    // - Query Options sub-TLV with Type IDs as value.

    tlv.SetType(Mle::Tlv::kLinkMetricsQuery);
    tlv.SetLength(sizeof(Tlv) + sizeof(uint8_t) + ((aInfo.mTypeIdCount == 0) ? 0 : (sizeof(Tlv) + aInfo.mTypeIdCount)));

    SuccessOrExit(error = aMessage.Append(tlv));
    SuccessOrExit(error = Tlv::Append<QueryIdSubTlv>(aMessage, aInfo.mSeriesId));

    if (aInfo.mTypeIdCount != 0)
    {
        QueryOptionsSubTlv queryOptionsTlv;

        queryOptionsTlv.Init();
        queryOptionsTlv.SetLength(aInfo.mTypeIdCount);
        SuccessOrExit(error = aMessage.Append(queryOptionsTlv));
        SuccessOrExit(error = aMessage.AppendBytes(aInfo.mTypeIds, aInfo.mTypeIdCount));
    }

exit:
    return error;
}
void Initiator::HandleReport(const Message &aMessage, OffsetRange &aOffsetRange, const Ip6::Address &aAddress)
{
    Error           error     = kErrorNone;
    bool            hasStatus = false;
    bool            hasReport = false;
    Tlv::ParsedInfo tlvInfo;
    ReportSubTlv    reportTlv;
    MetricsValues   values;
    uint8_t         status;
    uint8_t         typeId;

    OT_UNUSED_VARIABLE(error);

    VerifyOrExit(mReportCallback.IsSet());

    values.Clear();

    for (; !aOffsetRange.IsEmpty(); aOffsetRange.AdvanceOffset(tlvInfo.GetSize()))
    {
        SuccessOrExit(error = tlvInfo.ParseFrom(aMessage, aOffsetRange));

        if (tlvInfo.mIsExtended)
        {
            continue;
        }

        // The report must contain either:
        // - One or more Report Sub-TLVs (in case of success), or
        // - A single Status Sub-TLV (in case of failure).

        switch (tlvInfo.mType)
        {
        case StatusSubTlv::kType:
            VerifyOrExit(!hasStatus && !hasReport, error = kErrorDrop);
            SuccessOrExit(error = Tlv::Read<StatusSubTlv>(aMessage, aOffsetRange.GetOffset(), status));
            hasStatus = true;
            break;

        case ReportSubTlv::kType:
            VerifyOrExit(!hasStatus, error = kErrorDrop);

            // Read the report sub-TLV assuming minimum length
            SuccessOrExit(error = aMessage.Read(aOffsetRange, &reportTlv, sizeof(Tlv) + ReportSubTlv::kMinLength));
            VerifyOrExit(reportTlv.IsValid(), error = kErrorParse);
            hasReport = true;

            typeId = reportTlv.GetMetricsTypeId();

            if (TypeId::IsExtended(typeId))
            {
                // Skip the sub-TLV if `E` flag is set.
                break;
            }

            if (TypeId::GetValueLength(typeId) > sizeof(uint8_t))
            {
                // If Type ID indicates metric value has 4 bytes length, we
                // read the full `reportTlv`.
                SuccessOrExit(error = aMessage.Read(aOffsetRange.GetOffset(), reportTlv));
            }

            switch (typeId)
            {
            case TypeId::kPdu:
                values.mMetrics.mPduCount = true;
                values.mPduCountValue     = reportTlv.GetMetricsValue32();
                LogDebg(" - PDU Counter: %lu (Count/Summation)", ToUlong(values.mPduCountValue));
                break;

            case TypeId::kLqi:
                values.mMetrics.mLqi = true;
                values.mLqiValue     = reportTlv.GetMetricsValue8();
                LogDebg(" - LQI: %u (Exponential Moving Average)", values.mLqiValue);
                break;

            case TypeId::kLinkMargin:
                values.mMetrics.mLinkMargin = true;
                values.mLinkMarginValue     = ScaleRawValueToLinkMargin(reportTlv.GetMetricsValue8());
                LogDebg(" - Margin: %u (dB) (Exponential Moving Average)", values.mLinkMarginValue);
                break;

            case TypeId::kRssi:
                values.mMetrics.mRssi = true;
                values.mRssiValue     = ScaleRawValueToRssi(reportTlv.GetMetricsValue8());
                LogDebg(" - RSSI: %u (dBm) (Exponential Moving Average)", values.mRssiValue);
                break;
            }

            break;
        }
    }

    VerifyOrExit(hasStatus || hasReport);

    mReportCallback.Invoke(&aAddress, hasStatus ? nullptr : &values,
                           hasStatus ? MapEnum(static_cast<Status>(status)) : MapEnum(kStatusSuccess));

exit:
    LogDebg("HandleReport, error:%s", ErrorToString(error));
}
Error Initiator::SendMgmtRequestForwardTrackingSeries(const Ip6::Address &aDestination,
                                                      uint8_t             aSeriesId,
                                                      const SeriesFlags  &aSeriesFlags,
                                                      const Metrics      *aMetrics)
{
    Error               error;
    Neighbor           *neighbor;
    uint8_t             typeIdCount = 0;
    FwdProbingRegSubTlv fwdProbingSubTlv;

    SuccessOrExit(error = FindNeighbor(aDestination, neighbor));

    VerifyOrExit(aSeriesId > kQueryIdSingleProbe, error = kErrorInvalidArgs);

    fwdProbingSubTlv.Init();
    fwdProbingSubTlv.SetSeriesId(aSeriesId);
    fwdProbingSubTlv.SetSeriesFlagsMask(aSeriesFlags.ConvertToMask());

    if (aMetrics != nullptr)
    {
        typeIdCount = aMetrics->ConvertToTypeIds(fwdProbingSubTlv.GetTypeIds());
    }

    fwdProbingSubTlv.SetLength(sizeof(aSeriesId) + sizeof(uint8_t) + typeIdCount);

    error = Get<Mle::Mle>().SendLinkMetricsManagementRequest(aDestination, fwdProbingSubTlv);

exit:
    LogDebg("SendMgmtRequestForwardTrackingSeries, error:%s, Series ID:%u", ErrorToString(error), aSeriesId);
    return error;
}
Error Initiator::SendMgmtRequestEnhAckProbing(const Ip6::Address &aDestination,
                                              EnhAckFlags         aEnhAckFlags,
                                              const Metrics      *aMetrics)
{
    Error              error;
    Neighbor          *neighbor;
    uint8_t            typeIdCount = 0;
    EnhAckConfigSubTlv enhAckConfigSubTlv;

    SuccessOrExit(error = FindNeighbor(aDestination, neighbor));

    if (aEnhAckFlags == kEnhAckClear)
    {
        VerifyOrExit(aMetrics == nullptr, error = kErrorInvalidArgs);
    }

    enhAckConfigSubTlv.Init();
    enhAckConfigSubTlv.SetEnhAckFlags(aEnhAckFlags);

    if (aMetrics != nullptr)
    {
        typeIdCount = aMetrics->ConvertToTypeIds(enhAckConfigSubTlv.GetTypeIds());
    }

    enhAckConfigSubTlv.SetLength(EnhAckConfigSubTlv::kMinLength + typeIdCount);

    error = Get<Mle::Mle>().SendLinkMetricsManagementRequest(aDestination, enhAckConfigSubTlv);

    if (aMetrics != nullptr)
    {
        neighbor->SetEnhAckProbingMetrics(*aMetrics);
    }
    else
    {
        Metrics metrics;

        metrics.Clear();
        neighbor->SetEnhAckProbingMetrics(metrics);
    }

exit:
    return error;
}
Error Initiator::HandleManagementResponse(const Message &aMessage, const Ip6::Address &aAddress)
{
    Error           error = kErrorNone;
    OffsetRange     offsetRange;
    Tlv::ParsedInfo tlvInfo;
    uint8_t         status;
    bool            hasStatus = false;

    VerifyOrExit(mMgmtResponseCallback.IsSet());

    SuccessOrExit(error = Tlv::FindTlvValueOffsetRange(aMessage, Mle::Tlv::Type::kLinkMetricsManagement, offsetRange));

    for (; !offsetRange.IsEmpty(); offsetRange.AdvanceOffset(tlvInfo.GetSize()))
    {
        SuccessOrExit(error = tlvInfo.ParseFrom(aMessage, offsetRange));

        if (tlvInfo.mIsExtended)
        {
            continue;
        }

        switch (tlvInfo.mType)
        {
        case StatusSubTlv::kType:
            VerifyOrExit(!hasStatus, error = kErrorParse);
            SuccessOrExit(error = Tlv::Read<StatusSubTlv>(aMessage, offsetRange.GetOffset(), status));
            hasStatus = true;
            break;

        default:
            break;
        }
    }

    VerifyOrExit(hasStatus, error = kErrorParse);

    mMgmtResponseCallback.Invoke(&aAddress, MapEnum(static_cast<Status>(status)));

exit:
    return error;
}
Error Initiator::SendLinkProbe(const Ip6::Address &aDestination, uint8_t aSeriesId, uint8_t aLength)
{
    Error     error;
    uint8_t   buf[kLinkProbeMaxLen];
    Neighbor *neighbor;

    SuccessOrExit(error = FindNeighbor(aDestination, neighbor));

    VerifyOrExit(aLength <= kLinkProbeMaxLen && aSeriesId != kQueryIdSingleProbe && aSeriesId != kSeriesIdAllSeries,
                 error = kErrorInvalidArgs);

    error = Get<Mle::Mle>().SendLinkProbe(aDestination, aSeriesId, buf, aLength);

exit:
    LogDebg("SendLinkProbe, error:%s, Series ID:%u", ErrorToString(error), aSeriesId);
    return error;
}
void Initiator::ProcessEnhAckIeData(const uint8_t *aData, uint8_t aLength, const Neighbor &aNeighbor)
{
    MetricsValues values;
    uint8_t       idx = 0;

    VerifyOrExit(mEnhAckProbingIeReportCallback.IsSet());

    values.SetMetrics(aNeighbor.GetEnhAckProbingMetrics());

    if (values.GetMetrics().mLqi && idx < aLength)
    {
        values.mLqiValue = aData[idx++];
    }

    if (values.GetMetrics().mLinkMargin && idx < aLength)
    {
        values.mLinkMarginValue = ScaleRawValueToLinkMargin(aData[idx++]);
    }

    if (values.GetMetrics().mRssi && idx < aLength)
    {
        values.mRssiValue = ScaleRawValueToRssi(aData[idx++]);
    }

    mEnhAckProbingIeReportCallback.Invoke(aNeighbor.GetRloc16(), &aNeighbor.GetExtAddress(), &values);

exit:
    return;
}
Error Initiator::FindNeighbor(const Ip6::Address &aDestination, Neighbor *&aNeighbor)
{
    Error        error = kErrorUnknownNeighbor;
    Mac::Address macAddress;

    aNeighbor = nullptr;

    VerifyOrExit(aDestination.IsLinkLocalUnicast());
    aDestination.GetIid().ConvertToMacAddress(macAddress);

    aNeighbor = Get<NeighborTable>().FindNeighbor(macAddress);
    VerifyOrExit(aNeighbor != nullptr);

    VerifyOrExit(aNeighbor->GetVersion() >= kThreadVersion1p2, error = kErrorNotCapable);
    error = kErrorNone;

exit:
    return error;
}
#endif // OPENTHREAD_CONFIG_MLE_LINK_METRICS_INITIATOR_ENABLE
#if OPENTHREAD_CONFIG_MLE_LINK_METRICS_SUBJECT_ENABLE
Subject::Subject(Instance &aInstance)
    : InstanceLocator(aInstance)
{
}
Error Subject::AppendReport(Message &aMessage, const Message &aRequestMessage, Neighbor &aNeighbor)
{
    Error           error = kErrorNone;
    Tlv             tlv;
    Tlv::ParsedInfo tlvInfo;
    uint8_t         queryId;
    bool            hasQueryId = false;
    uint16_t        length;
    uint16_t        offset;
    OffsetRange     offsetRange;
    MetricsValues   values;

    values.Clear();

    // - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    // Parse MLE Link Metrics Query TLV and its sub-TLVs from
    // `aRequestMessage`.

    SuccessOrExit(error = Tlv::FindTlvValueOffsetRange(aRequestMessage, Mle::Tlv::Type::kLinkMetricsQuery, offsetRange));

    for (; !offsetRange.IsEmpty(); offsetRange.AdvanceOffset(tlvInfo.GetSize()))
    {
        SuccessOrExit(error = tlvInfo.ParseFrom(aRequestMessage, offsetRange));

        if (tlvInfo.mIsExtended)
        {
            continue;
        }

        switch (tlvInfo.mType)
        {
        case SubTlv::kQueryId:
            SuccessOrExit(error = Tlv::Read<QueryIdSubTlv>(aRequestMessage, tlvInfo.mTlvOffsetRange.GetOffset(), queryId));
            hasQueryId = true;
            break;

        case SubTlv::kQueryOptions:
            SuccessOrExit(error = ReadTypeIdsFromMessage(aRequestMessage, tlvInfo.mValueOffsetRange, values.GetMetrics()));
            break;

        default:
            break;
        }
    }

    VerifyOrExit(hasQueryId, error = kErrorParse);

    // - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    // Append MLE Link Metrics Report TLV and its sub-TLVs to
    // `aMessage`.

    offset = aMessage.GetLength();
    tlv.SetType(Mle::Tlv::kLinkMetricsReport);
    SuccessOrExit(error = aMessage.Append(tlv));

    if (queryId == kQueryIdSingleProbe)
    {
        values.mPduCountValue   = aRequestMessage.GetPsduCount();
        values.mLqiValue        = aRequestMessage.GetAverageLqi();
        values.mLinkMarginValue = Get<Mac::Mac>().ComputeLinkMargin(aRequestMessage.GetAverageRss());
        values.mRssiValue       = aRequestMessage.GetAverageRss();

        SuccessOrExit(error = AppendReportSubTlvToMessage(aMessage, values));
    }
    else
    {
        SeriesInfo *seriesInfo = aNeighbor.GetForwardTrackingSeriesInfo(queryId);

        if (seriesInfo == nullptr)
        {
            SuccessOrExit(error = Tlv::Append<StatusSubTlv>(aMessage, kStatusSeriesIdNotRecognized));
        }
        else if (seriesInfo->GetPduCount() == 0)
        {
            SuccessOrExit(error = Tlv::Append<StatusSubTlv>(aMessage, kStatusNoMatchingFramesReceived));
        }
        else
        {
            values.SetMetrics(seriesInfo->GetLinkMetrics());
            values.mPduCountValue   = seriesInfo->GetPduCount();
            values.mLqiValue        = seriesInfo->GetAverageLqi();
            values.mLinkMarginValue = Get<Mac::Mac>().ComputeLinkMargin(seriesInfo->GetAverageRss());
            values.mRssiValue       = seriesInfo->GetAverageRss();

            SuccessOrExit(error = AppendReportSubTlvToMessage(aMessage, values));
        }
    }

    // Update the TLV length in message.
    length = aMessage.GetLength() - offset - sizeof(Tlv);
    tlv.SetLength(static_cast<uint8_t>(length));
    aMessage.Write(offset, tlv);

exit:
    LogDebg("AppendReport, error:%s", ErrorToString(error));
    return error;
}
Error Subject::HandleManagementRequest(const Message &aMessage, Neighbor &aNeighbor, Status &aStatus)
{
    Error               error = kErrorNone;
    OffsetRange         offsetRange;
    Tlv::ParsedInfo     tlvInfo;
    FwdProbingRegSubTlv fwdProbingSubTlv;
    EnhAckConfigSubTlv  enhAckConfigSubTlv;
    Metrics             metrics;

    SuccessOrExit(error = Tlv::FindTlvValueOffsetRange(aMessage, Mle::Tlv::Type::kLinkMetricsManagement, offsetRange));

    // Set sub-TLV lengths to zero to indicate that we have
    // not yet seen them in the message.

    fwdProbingSubTlv.SetLength(0);
    enhAckConfigSubTlv.SetLength(0);

    for (; !offsetRange.IsEmpty(); offsetRange.AdvanceOffset(tlvInfo.GetSize()))
    {
        uint16_t    minTlvSize;
        Tlv        *subTlv;
        OffsetRange tlvOffsetRange;

        SuccessOrExit(error = tlvInfo.ParseFrom(aMessage, offsetRange));

        if (tlvInfo.mIsExtended)
        {
            continue;
        }

        tlvOffsetRange = tlvInfo.mTlvOffsetRange;

        switch (tlvInfo.mType)
        {
        case SubTlv::kFwdProbingReg:
            subTlv     = &fwdProbingSubTlv;
            minTlvSize = sizeof(Tlv) + FwdProbingRegSubTlv::kMinLength;
            break;

        case SubTlv::kEnhAckConfig:
            subTlv     = &enhAckConfigSubTlv;
            minTlvSize = sizeof(Tlv) + EnhAckConfigSubTlv::kMinLength;
            break;

        default:
            continue;
        }

        // Ensure message contains only one sub-TLV.
        VerifyOrExit(fwdProbingSubTlv.GetLength() == 0, error = kErrorParse);
        VerifyOrExit(enhAckConfigSubTlv.GetLength() == 0, error = kErrorParse);

        VerifyOrExit(tlvInfo.GetSize() >= minTlvSize, error = kErrorParse);

        // Read `subTlv` with its `minTlvSize`, followed by the Type IDs.
        SuccessOrExit(error = aMessage.Read(tlvOffsetRange, subTlv, minTlvSize));
        tlvOffsetRange.AdvanceOffset(minTlvSize);
        SuccessOrExit(error = ReadTypeIdsFromMessage(aMessage, tlvOffsetRange, metrics));
    }

    if (fwdProbingSubTlv.GetLength() != 0)
    {
        aStatus = ConfigureForwardTrackingSeries(fwdProbingSubTlv.GetSeriesId(), fwdProbingSubTlv.GetSeriesFlagsMask(),
                                                 metrics, aNeighbor);
    }

    if (enhAckConfigSubTlv.GetLength() != 0)
    {
        aStatus = ConfigureEnhAckProbing(enhAckConfigSubTlv.GetEnhAckFlags(), metrics, aNeighbor);
    }

exit:
    return error;
}
Error Subject::HandleLinkProbe(const Message &aMessage, uint8_t &aSeriesId)
{
    Error       error = kErrorNone;
    OffsetRange offsetRange;

    SuccessOrExit(error = Tlv::FindTlvValueOffsetRange(aMessage, Mle::Tlv::Type::kLinkProbe, offsetRange));
    error = aMessage.Read(offsetRange, aSeriesId);

exit:
    return error;
}
Error Subject::AppendReportSubTlvToMessage(Message &aMessage, const MetricsValues &aValues)
{
    Error        error = kErrorNone;
    ReportSubTlv reportTlv;

    reportTlv.Init();

    if (aValues.mMetrics.mPduCount)
    {
        reportTlv.SetMetricsTypeId(TypeId::kPdu);
        reportTlv.SetMetricsValue32(aValues.mPduCountValue);
        SuccessOrExit(error = reportTlv.AppendTo(aMessage));
    }

    if (aValues.mMetrics.mLqi)
    {
        reportTlv.SetMetricsTypeId(TypeId::kLqi);
        reportTlv.SetMetricsValue8(aValues.mLqiValue);
        SuccessOrExit(error = reportTlv.AppendTo(aMessage));
    }

    if (aValues.mMetrics.mLinkMargin)
    {
        reportTlv.SetMetricsTypeId(TypeId::kLinkMargin);
        reportTlv.SetMetricsValue8(ScaleLinkMarginToRawValue(aValues.mLinkMarginValue));
        SuccessOrExit(error = reportTlv.AppendTo(aMessage));
    }

    if (aValues.mMetrics.mRssi)
    {
        reportTlv.SetMetricsTypeId(TypeId::kRssi);
        reportTlv.SetMetricsValue8(ScaleRssiToRawValue(aValues.mRssiValue));
        SuccessOrExit(error = reportTlv.AppendTo(aMessage));
    }

exit:
    return error;
}
void Subject::Free(SeriesInfo &aSeriesInfo) { mSeriesInfoPool.Free(aSeriesInfo); }
Error Subject::ReadTypeIdsFromMessage(const Message &aMessage, const OffsetRange &aOffsetRange, Metrics &aMetrics)
{
    Error       error       = kErrorNone;
    OffsetRange offsetRange = aOffsetRange;

    aMetrics.Clear();

    while (!offsetRange.IsEmpty())
    {
        uint8_t typeId;

        SuccessOrExit(aMessage.Read(offsetRange, typeId));

        switch (typeId)
        {
        case TypeId::kPdu:
            VerifyOrExit(!aMetrics.mPduCount, error = kErrorParse);
            aMetrics.mPduCount = true;
            break;

        case TypeId::kLqi:
            VerifyOrExit(!aMetrics.mLqi, error = kErrorParse);
            aMetrics.mLqi = true;
            break;

        case TypeId::kLinkMargin:
            VerifyOrExit(!aMetrics.mLinkMargin, error = kErrorParse);
            aMetrics.mLinkMargin = true;
            break;

        case TypeId::kRssi:
            VerifyOrExit(!aMetrics.mRssi, error = kErrorParse);
            aMetrics.mRssi = true;
            break;

        default:
            if (TypeId::IsExtended(typeId))
            {
                offsetRange.AdvanceOffset(sizeof(uint8_t)); // Skip the additional second byte.
            }
            else
            {
                aMetrics.mReserved = true;
            }
            break;
        }

        offsetRange.AdvanceOffset(sizeof(uint8_t));
    }

exit:
    return error;
}
Status Subject::ConfigureForwardTrackingSeries(uint8_t        aSeriesId,
                                               uint8_t        aSeriesFlagsMask,
                                               const Metrics &aMetrics,
                                               Neighbor      &aNeighbor)
{
    Status status = kStatusSuccess;

    VerifyOrExit(0 < aSeriesId, status = kStatusOtherError);

    if (aSeriesFlagsMask == 0) // Remove the series
    {
        if (aSeriesId == kSeriesIdAllSeries) // Remove all
        {
            aNeighbor.RemoveAllForwardTrackingSeriesInfo();
        }
        else
        {
            SeriesInfo *seriesInfo = aNeighbor.RemoveForwardTrackingSeriesInfo(aSeriesId);

            VerifyOrExit(seriesInfo != nullptr, status = kStatusSeriesIdNotRecognized);
            mSeriesInfoPool.Free(*seriesInfo);
        }
    }
    else // Add a new series
    {
        SeriesInfo *seriesInfo = aNeighbor.GetForwardTrackingSeriesInfo(aSeriesId);

        VerifyOrExit(seriesInfo == nullptr, status = kStatusSeriesIdAlreadyRegistered);

        seriesInfo = mSeriesInfoPool.Allocate();
        VerifyOrExit(seriesInfo != nullptr, status = kStatusCannotSupportNewSeries);

        seriesInfo->Init(aSeriesId, aSeriesFlagsMask, aMetrics);
        aNeighbor.AddForwardTrackingSeriesInfo(*seriesInfo);
    }

exit:
    return status;
}
Status Subject::ConfigureEnhAckProbing(uint8_t aEnhAckFlags, const Metrics &aMetrics, Neighbor &aNeighbor)
{
    Status status = kStatusSuccess;
    Error  error  = kErrorNone;

    VerifyOrExit(!aMetrics.mReserved, status = kStatusOtherError);

    if (aEnhAckFlags == kEnhAckRegister)
    {
        VerifyOrExit(!aMetrics.mPduCount, status = kStatusOtherError);
        VerifyOrExit(aMetrics.mLqi || aMetrics.mLinkMargin || aMetrics.mRssi, status = kStatusOtherError);
        VerifyOrExit(!(aMetrics.mLqi && aMetrics.mLinkMargin && aMetrics.mRssi), status = kStatusOtherError);

        error = Get<Radio>().ConfigureEnhAckProbing(aMetrics, aNeighbor.GetRloc16(), aNeighbor.GetExtAddress());
    }
    else if (aEnhAckFlags == kEnhAckClear)
    {
        VerifyOrExit(!aMetrics.mLqi && !aMetrics.mLinkMargin && !aMetrics.mRssi, status = kStatusOtherError);
        error = Get<Radio>().ConfigureEnhAckProbing(aMetrics, aNeighbor.GetRloc16(), aNeighbor.GetExtAddress());
    }
    else
    {
        status = kStatusOtherError;
    }

    VerifyOrExit(error == kErrorNone, status = kStatusOtherError);

exit:
    return status;
}
#endif // OPENTHREAD_CONFIG_MLE_LINK_METRICS_SUBJECT_ENABLE
uint8_t ScaleLinkMarginToRawValue(uint8_t aLinkMargin)
{
// Linearly scale Link Margin from [0, 130] to [0, 255].
// `kMaxLinkMargin = 130`.
uint16_t value;
value = Min(aLinkMargin, kMaxLinkMargin);
value = value * NumericLimits<uint8_t>::kMax;
value = DivideAndRoundToClosest<uint16_t>(value, kMaxLinkMargin);
return static_cast<uint8_t>(value);
}
uint8_t ScaleRawValueToLinkMargin(uint8_t aRawValue)
{
// Scale back raw value of [0, 255] to Link Margin from [0, 130].
uint16_t value = aRawValue;
value = value * kMaxLinkMargin;
value = DivideAndRoundToClosest<uint16_t>(value, NumericLimits<uint8_t>::kMax);
return static_cast<uint8_t>(value);
}
uint8_t ScaleRssiToRawValue(int8_t aRssi)
{
// Linearly scale RSSI from [-130, 0] to [0, 255].
// `kMinRssi = -130`, `kMaxRssi = 0`.
int32_t value = aRssi;
value = Clamp(value, kMinRssi, kMaxRssi) - kMinRssi;
value = value * NumericLimits<uint8_t>::kMax;
value = DivideAndRoundToClosest<int32_t>(value, kMaxRssi - kMinRssi);
return static_cast<uint8_t>(value);
}
int8_t ScaleRawValueToRssi(uint8_t aRawValue)
{
int32_t value = aRawValue;
value = value * (kMaxRssi - kMinRssi);
value = DivideAndRoundToClosest<int32_t>(value, NumericLimits<uint8_t>::kMax);
value += kMinRssi;
return ClampToInt8(value);
}
} // namespace LinkMetrics
} // namespace ot
#endif // OPENTHREAD_CONFIG_MLE_LINK_METRICS_INITIATOR_ENABLE || OPENTHREAD_CONFIG_MLE_LINK_METRICS_SUBJECT_ENABLE
```
|
This list contains all of the Pyrenean three-thousanders, namely the 129 mountain summits of or more above sea level in the Pyrenees, a range of mountains in southwest Europe that forms a natural border between France and Spain. The Pyrenees extend for about from the Bay of Biscay (Cap Higuer) to the Mediterranean Sea (Cap de Creus). The highest mountain in the Pyrenees is Aneto in Spain at .
The summits meeting the 3,000-metre criterion were defined by a UIAA-sponsored joint Franco-Spanish team led by Juan Buyse. The UIAA list, published in 1990, also contains 83 secondary summits in addition to the 129 principal ones listed here, and divides the range into 11 zones. According to the latest surveys, three of the peaks in the original list are actually below 3,000 m but are still included below.
The selection criteria used here are quite broad – many of the peaks included are secondary summits of major mountains. Using prominence as a criterion, only one summit is an ultra-prominent peak, Aneto; a further three have a prominence of over 1,000 m (Pico Posets, Pica d'Estats, Vignemale), and five more have a prominence of over 600 m. Only 17 in total have a prominence of more than 300 m, commonly used as a criterion for determining an independent mountain, and these are indicated in bold in the table below. A further 28 have a prominence of over 100 m and can be considered significant summits.
All the peaks in this list are in Spain (59 peaks) or France (26 peaks), or delimit the border between the two countries (45). The two highest major mountains and their subsidiary summits (Aneto and Posets, zones 9 and 7 respectively) are entirely in Spain, together with the Besiberri peaks (zone 10), while Pic Long and the surrounding mountains (zone 5) are entirely in France. Most of the other mountains lie on or close to the border. The small country of Andorra is located in the eastern portion of the Pyrenees and is surrounded by Spain and France; its highest mountain – Coma Pedrosa at – falls below the 3,000-metre threshold. The mountains are listed by height within each of the 11 zones.
Table (incomplete) of Pyrenean 3000m summits
For the complete list see: Pyrenees#Highest summits
Zone 1 : Balaïtous-Enfer-Argualas
Zone 2 : Vignemale
Zone 3 : Monte Perdido
Zone 4 : La Munia
Zone 5 : Néouvielle-Pic Long
Zone 6 : Batoua-Batchimale
Zone 7 : Posets-Eristé
Zone 8 : Clarabide-Perdiguero-Boum
Zone 9 : Maladeta-Aneto
Zone 10 : Besiberris
Zone 11 : Estats-Montcalm
See also
Three-thousanders
List of mountains in Aragon
List of mountains in Catalonia
Peak bagging
References
External links
All of the summits – including secondary summits – contained in the Buyse list (in French)
Bibliography
Pyrenees, three-thousanders
Pyrenees
|
```php
<?php
/*
* This file is part of the Symfony package.
*
* (c) Fabien Potencier <fabien@symfony.com>
*
* For the full copyright and license information, please view the LICENSE
* file that was distributed with this source code.
*/
namespace Symfony\Component\Console\Input;
/**
* StreamableInputInterface is the interface implemented by all input classes
* that have an input stream.
*
* @author Robin Chalas <robin.chalas@gmail.com>
*/
interface StreamableInputInterface extends InputInterface
{
/**
* Sets the input stream to read from when interacting with the user.
*
* This is mainly useful for testing purposes.
*
* @param resource $stream The input stream
*/
public function setStream($stream);
/**
* Returns the input stream.
*
* @return resource|null
*/
public function getStream();
}
```
|
```php
<?php
/**
* Unit tests covering post mime types.
*
* @ticket 59195
*
* @group post
*
* @covers ::get_available_post_mime_types
*/
class Tests_Post_GetAvailablePostMimeTypes extends WP_UnitTestCase {
public function tear_down() {
// Remove all uploads.
$this->remove_added_uploads();
remove_filter( 'pre_get_available_post_mime_types', array( $this, 'filter_add_null_to_post_mime_types' ) );
parent::tear_down();
}
public function test_should_return_expected_post_mime_types() {
// Upload a JPEG image.
$filename = DIR_TESTDATA . '/images/test-image.jpg';
$contents = file_get_contents( $filename );
$upload = wp_upload_bits( wp_basename( $filename ), null, $contents );
$this->assertEmpty( $upload['error'], 'Uploading a JPEG file should not result in an error.' );
$this->_make_attachment( $upload );
// Upload a PDF file.
$filename = DIR_TESTDATA . '/images/test-alpha.pdf';
$contents = file_get_contents( $filename );
$upload = wp_upload_bits( wp_basename( $filename ), null, $contents );
$this->assertEmpty( $upload['error'], 'Uploading a PDF file should not result in an error.' );
$this->_make_attachment( $upload );
$mime_types = get_available_post_mime_types();
$this->assertSame( array( 'image/jpeg', 'application/pdf' ), $mime_types, 'The MIME types returned should match the uploaded file MIME types.' );
}
public function test_should_remove_null() {
// Add filter to inject null into the mime types array.
add_filter( 'pre_get_available_post_mime_types', array( $this, 'filter_add_null_to_post_mime_types' ) );
$mime_types = get_available_post_mime_types();
$this->assertEqualsCanonicalizing( array( 'image/jpeg', 'image/png' ), $mime_types );
}
/**
* Filter to inject null into the mime types array.
*
* @param mixed $type The pre-filter value passed by the hook (unused).
* @return array
*/
public function filter_add_null_to_post_mime_types( $type ) {
return array( 'image/jpeg', null, 'image/png' );
}
}
```
|
Karl Harko von Noorden (13 September 1858 – 26 October 1944) was a German internist, born in Bonn and educated in medicine at Tübingen, Freiburg, and Leipzig (M.D., 1882).
In 1885 he was admitted as privatdocent to the medical faculty of the University of Giessen, where he had been assistant in the medical clinic since 1883. In 1889 he became first assistant of the medical clinic at Berlin University, in 1894 was called to Frankfurt am Main as physician in charge of the municipal hospital, and in 1906 was appointed professor of medicine at the University of Vienna, as successor to Hermann Nothnagel.
Von Noorden conducted special research on albuminuria in health, metabolic disorders and their treatment, diabetes, diseases of the kidney, dietetics, etc., and wrote on these subjects, some of his books appearing in English. Among his assistants was the Austrian-American psychologist Rudolf von Urban.
Von Noorden advocated an "oat-cure" to treat diabetes. The diet "consisted of 250 gm. oatmeal a day - 80 gm. with about 0.4 liter water each meal and maybe some vegetables or fruits for the taste".
He died in Vienna. His father, also named Carl von Noorden (1833–1883) was a noted historian.
Selected publications
His book on diabetes and its treatment, "Die zuckerkrankheit und ihre behandlung" (1895), was published over numerous editions. Some of his books have been published in English, such as:
"Metabolism and practical medicine", (3 volumes, 1907).
Clinical treatises on the pathology and therapy of disorders of metabolism and nutrition (8 volumes, 1903-09); with Karl Franz Dapper; Hugo Salomon; Hermann Strauss.
New Aspects of Diabetes: Pathology and Treatment, 1913.
Terms
Noorden treatment—Oatmeal treatment. Treatment of diabetes by restricting the protein of the diet and limiting the carbohydrates to oatmeal.
References
1858 births
1944 deaths
Dietitians
German diabetologists
German internists
German emigrants to Austria
German pathologists
German untitled nobility
Physicians from Bonn
Physicians from Vienna
University of Tübingen alumni
Academic staff of the University of Vienna
Members of the Royal Society of Sciences in Uppsala
|
National Route 264 is a national highway of Japan connecting Saga, Saga and Kurume, Fukuoka in Japan, with a total length of
See also
References
National highways in Japan
Roads in Fukuoka Prefecture
Roads in Saga Prefecture
|
Tik Tik Tik is a 2018 Indian Tamil-language science fiction thriller film written and directed by Shakti Soundar Rajan. The film features Jayam Ravi, Aaron Aziz and Nivetha Pethuraj in the lead roles. The film is inspired from the 1998 Michael Bay film Armageddon.
The film was released on 22 June 2018 and received generally positive reviews from critics and audiences, who praised Jayam Ravi's performance, the cinematography, background score, soundtrack and VFX, but criticized its logic-defying sequences and writing. It grossed over and was a commercial success at the box office.
Plot
The DSD team, headed by Mahendran, finds that an asteroid will hit Earth in seven days. The lives of people in and around Chennai are at stake. The only way to destroy the asteroid is with a heavy missile, and with so little time remaining, the team decides to recover one through illegal means. They go ahead with this secret mission, which only the Prime Minister, Home Minister and Defence Minister know about. The biggest challenge is taking the missile to space, which will only be possible with expert thieves. Escape artist Vasu and his friends Venkat and Appu are hired. Joining them are DSD members Swathi and Raguram. After training, they board the spacecraft Dhruva 1. Before launch, Vasu hears a mysterious voice on the voice channel of his communication device. The voice tells him that Ravi, Vasu's son, has been kidnapped, and that to secure his release, Vasu must do whatever the voice tells him.
Dhruva 1 launches successfully. En route to the space station, the voice tells Vasu to cut some wires, which will cause a fuel leak. He feigns unconsciousness before proceeding to cut them. When Venkat questions him, Vasu reveals Ravi's kidnapping. The sabotage causes Dhruva 1 to spiral out of control and crash-land on the moon. The crew step out to repair the ship, where they find the cut wires. The ship is repaired, but it has little fuel left. Vasu suggests refuelling at the space station, which would give them an excuse to enter it. They agree to the plan and request permission to refuel, which the crew on board the space station approve. Dhruva 1 then leaves the moon, bound for the space station. Meanwhile, images of Vasu cutting the wires are discovered by Lt. Gen. T. Rithika and brought to Mahendran's attention, but Mahendran does not pursue the matter further.
Meanwhile, Dhruva 1 successfully docks at the space station, but the crew are arrested by the station's personnel under Captain Li Wei. It is revealed that the station's crew know the true intentions behind Dhruva 1's mission and intend to hold them off so that the asteroid will hit the Bay of Bengal, destroying India while their own country profits from the major rebuilding projects. While in the space station, Venkat and Appu hack its systems, causing a power outage, while Vasu obtains Captain Li Wei's fingerprint and iris scan in order to enter the chamber holding the missile. Vasu manages to make the missile disappear without triggering the alarm system. Captain Li Wei realizes this and interrogates the crew, assaulting them physically. Vasu claims that the missile is in Dhruva 1, but when some of the station's crew follow him, he kills them on the ship and conveys a message through Swathi that the missile is pointed at the capital of the country owning it, and that he will launch it at the city if the crew of Dhruva 1 are not released.
Dhruva 1's crew are released and proceed to the ship, while Vasu goes to the chamber that held the missile, where it is revealed that the missile was there the whole time. He steals it and exits through space, but hits a piece of the station, puncturing his spacesuit and causing his oxygen tank to leak. He nevertheless manages to get the missile to Dhruva 1. The ship departs from the space station, and Vasu is revived. Meanwhile, Dhruva 2 arrives to refuel Dhruva 1; it is now revealed that the wires had to be cut so that Dhruva 2 would be mobilised. While the mysterious voice instructs Vasu to hand over the missile to the crew of Dhruva 2, Rithika hears this through Mahendran's communication device and goes to his office, where Mahendran kills her, revealing that he is the mysterious voice and that he wants to sell the missile on the black market.
Meanwhile, Dhruva 1 successfully refuels and heads to the asteroid to fire the missile, whereupon the crew realize that the missile is missing. Meanwhile, Ravi is released. Mahendran shows the images of Vasu sabotaging the ship and claims that Vasu was the cause of it all. The crew realize that India will be destroyed. However, the missile is revealed to still be on board the spacecraft and ready to be fired: Vasu had notified Swathi and Raghuram about the mysterious voice and had handed a decoy missile to Dhruva 2, and the ship's systems had been hacked to show the asteroid passing a safety line when it had not. The missile is fired, and the asteroid is split in half, missing the Earth. However, Dhruva 1 is hit by debris and is about to explode. The crew evacuate the ship in time and reach Earth. Ravi reveals the truth about Mahendran, having grabbed a gold chain belonging to him. Days later, Mahendran presents medals to the Dhruva 1 crew, where Vasu reveals that the whole crew know about Mahendran, who kills himself backstage while Vasu, Appu and Venkat walk away.
Cast
Jayam Ravi as M. Vasudevan, a trained magician and escape artist who is sent to space on a mission to obtain a missile to destroy an asteroid
Aaron Aziz as Captain Lee Wei, a cruel Chinese captain of the space station who refuses to give his missile away
Nivetha Pethuraj as Lt. M. Swathi, an Army official who is also sent on the mission to space
Aarav Ravi as Ravi Vasudevan, Vasu's son who gets kidnapped by Mahendran and would not be released until Mahendran obtains the missile
Ramesh Thilak as S. Venkat, Vasu's friend who is also sent on the mission to space
Arjunan as K. Appu, Vasu's friend who is also sent on the mission to space
Vincent Asokan as Brig. D. Raguram, an Army official and captain of the mission to space
Jayaprakash as Chief K. Mahendran, who covertly tells Vasu that he has kidnapped Ravi and will release him only once he obtains the missile
Rethika Srinivas as Lt. Gen. T. Rithika, a high-ranking official managing the mission, who ends up getting killed by Mahendran
Balaji Venugopal as Team head
Aathma Patrick as Terrorist
Jeeva Ravi as Police officer
Production
Development
After working together in Miruthan (2016), Jayam Ravi was again impressed by a storyline narrated by Shakti Soundar Rajan and agreed to work on another film in March 2016. Jhabak Movies agreed to produce the venture. It is the first Indian film in the space genre. The team began pre-production work thereafter, with Jayam Ravi describing it as the biggest film in his career.
Casting and crew
Actress Nivetha Pethuraj joined the film's cast in September 2016. She was selected due to her knowledge of martial arts. She is trained in jujutsu and kickboxing. Aaron Aziz, a Malaysian-based actor who mostly performed in Malaysian and Singaporean drama and films, was selected as the lead villain, marking his entry into Tamil cinema. Jayam Ravi's son, Aarav, plays the role of his son in this film too.
D. Imman composed the music for this film, continuing his collaboration with the director.
Filming
The team began filming in October 2016 at EVP Film City and Majestic Studio in Chennai. The film was also shot in Munnar, where shooting was halted for a while due to forest elephants arriving near the location. The total duration of VFX scenes in the film is 80 minutes.
The teaser was released on 15 August 2017. The film was initially scheduled to be released on 26 January 2018, but was postponed and eventually released on 22 June 2018.
Music
The film's score and songs were composed by D. Imman. The title track was released on 11 December 2017. The song was sung by Yogi B, Yuvan Shankar Raja and Sunitha Sarathy. The full album was released on 6 January 2018. The album has eight songs, four of which are instrumental songs (two theme songs and two karaoke songs). All the songs were written by Madhan Karky. This is Imman's 100th album. "Kurumba" (Father's Love) in the film featured real photographs and videos from Jayam Ravi's personal family collection.
Reception
Behindwoods rated the album 3 of 5 stars and said, "Imman knocks the ball out of the park for a six to reach his century, and the ball is travelling to space!"
Release
Theatrical
The film was released theatrically on 22 June 2018.
Home media
The satellite rights of the film were sold to Sun TV.
Reception
Box office
Tamil Nadu theatrical rights of the film were sold for 10.5 crore. Tik Tik Tik grossed 3 crore on its first day and 12 crore in its first three days in Tamil Nadu. The film collected over in Tamil Nadu in the second weekend. The film collected at the worldwide box office in 11 days. The film collected in the United States, in the UK and in Australia. It grossed US$45,724 in the USA, MYR 2,233,745 in Malaysia, £10,480 in the UK, A$34,493 in Australia, and NZ$3,424 in New Zealand in its opening weekend. The film collected US$68,812 in the USA, £19,046 in the UK, and A$49,176 in Australia by the end of the second weekend. Tik Tik Tik collected £20,888 in the UK and MYR 5,201,436 in Malaysia by the end of the third weekend. The film collected MYR 511,504 (86.6 lacs) in Malaysia by the end of the fourth weekend.
Critical response
Tik Tik Tik received positive reviews from critics. The review aggregator website Rotten Tomatoes reported an approval rating of 67% with an average score of 4.0/10, based on 5 reviews. The website's critical consensus reads, "On the whole, Tik Tik Tik could have been much more considering the newness of its genre, but the attempt is certainly a laudable one."
Thinkal Menon of The Times of India praised the film for the laudable attempt and gave it 3.5 stars out of 5. Sreedhar Pillai of Firstpost praised it as a reasonably entertaining film with a novel concept and gave it 3 out of 5 stars. Manoj Kumar R of The Indian Express said that the concept is not original and gave it 4 out of 5 stars. Sowmya Rajendran of The News Minute praised the film for its impressive VFX scenes. Vikram Venkateswaran of the Quint called the film a harmless entertainer and gave it 3 stars. Sudhir Srinivasan of Cinema Express admired the efforts of the director in this space genre film and gave it 3 stars.
Priyanka Sundar of Hindustan Times called it a film without logic and gave it 1.5 stars. Gautaman Bhaskaran of News18 stated that the film is an unimpressive story, and gave it 1.5 stars. Kirubhakar Purushothaman of India Today called the film a typical underwhelming commercial film set in space, and gave it 1.5 stars. Director Venkat Prabhu and actor Arvind Swami praised the film for the effort. Vishal Menon of The Hindu termed the film simplistic in nature. J. Hurtado of Screen Anarchy noted that the film had many similarities to Armageddon (1998), one of which included the human mission of saving the Earth from an asteroid.
A success celebration was held for the film in Chennai on 29 June 2018.
See also
List of films featuring space stations
References
External links
2018 films
2010s Tamil-language films
2018 science fiction action films
Indian science fiction action films
Indian Army in films
Films about astronauts
Films shot in Chennai
Films scored by D. Imman
Indian Space Research Organisation in fiction
|
Henry Malcolm McHenry (born May 19, 1944) is a professor of anthropology at the University of California, Davis, specializing in studies of human evolution, the origins of bipedality, and paleoanthropology.
McHenry has published on the comparative relationships among primate fossils. His findings have been featured in scholarly journals, and in publications including Science, The New York Times, Discover and National Geographic. McHenry earned bachelor's and master's degrees at UC Davis before earning his Ph.D. at Harvard.
Efficient Walker theory
Attempting to explain the evolutionary advent of bipedalism among hominids, McHenry and Peter Rodman have advanced the Efficient Walker theory, based on energetic analysis. The scientists compared the efficiency of chimpanzees walking on two versus four legs, finding two-legged locomotion far more efficient. They concluded that bipedalism was selected simply because it allowed a greater range of travel for hominids. As Miocene forests decreased and hominids were forced into the savannas, the scientists reason, bipedalism enabled greater access to resources.
Study of African ancestors
McHenry travels regularly to Africa to extend his knowledge of human origins, focusing his studies on the fossil remains of australopithecines, the best-known of which are the 3.2-million-year-old remains of 'Lucy', discovered in 1974 by Donald Johanson of the Institute for Human Origins. According to McHenry, "The earlier species (Lucy) is more primitive in its skull and teeth, but has human-like body proportions," whereas "the later species, africanus, with more human-like skull and teeth, has the more ape-like body proportions--big arms, small legs."
Publications
McHenry has produced over 130 publications, comprising papers, reviews, and contributions to books.
Papers
Among the papers which McHenry has written or contributed to are the following:
Books
Among the books which McHenry has written or contributed to are the following:
Book review
References
External links
UCDavis.edu - Henry McHenry's UC Davis homepage
UCDavis.edu - 'Origin of Bipedality', McHenry, H.M., Annual Review of Anthropology, vol 11, p 151-173 (1982)
UCDavis.edu - 'Henry McHenry honored for highly evolved teaching', Lisa Klionsky (March 3, 2000)
UCDavis.edu - 'The singing paleontologist: Back from his latest African visit, Henry McHenry has a bone to pick with an old theory about human evolution', Trina Wood
1944 births
Living people
American anthropologists
Primatologists
Harvard University alumni
University of California, Davis faculty
University of California, Davis alumni
Paleoanthropologists
|
Agelasta ocellifera is a species of beetle in the family Cerambycidae. It was described by John O. Westwood in 1863, originally under the genus Lamia. It is known from the Philippines.
References
ocellifera
Beetles described in 1863
|
Rui Pedro Reis Batalha (born 29 June 1996 in Santo Isidoro, Mafra) is a Portuguese footballer who plays for Real S.C. as a forward.
Football career
On 21 January 2015, Batalha made his professional debut with Gil Vicente in a 2014–15 Taça da Liga match against Marítimo.
References
External links
1996 births
Living people
Footballers from Cascais
Portuguese men's footballers
Men's association football forwards
Gil Vicente F.C. players
S.C.U. Torreense players
Real S.C. players
|
```text
Alternative Names
0
PARAM.SFO
/*
Dynasty Warriors Gundam Reborn
*/
#
Infinite Team Points
0
games24.blog.fc2.com/blog-entry-226.html
0 002E2B44 60000000
#
Infinite Materials
0
games24.blog.fc2.com/blog-entry-226.html
0 001090CC 7D044378
#
Infinite Money
0
games24.blog.fc2.com/blog-entry-226.html
0 0032A344 90A307B8
#
Transformation Back To 0 (Synthesis)
0
games24.blog.fc2.com/blog-entry-226.html
0 00100C84 3BA00000
#
Transformation Back To 0 (Special Equipment)
0
games24.blog.fc2.com/blog-entry-226.html
0 00100580 38800000
#
30,000 Plays, Clears, + Kills
0
games24.blog.fc2.com/blog-entry-226.html
0 00108F0C 39007530
0 00108F1C 38C07530
#
AoB Infinite Team Points
0
games24.blog.fc2.com/blog-entry-226.html
B 00010000 04000000
B 7C9B2010909D003C4BFFE8857C7D0734 60000000909D003C4BFFE8857C7D0734
#
AoB Infinite Materials
0
games24.blog.fc2.com/blog-entry-226.html
B 00010000 04000000
B 7C8540105483043EB08700004E800020 7D0443785483043EB08700004E800020
#
AoB Infinite Money
0
games24.blog.fc2.com/blog-entry-226.html
B 00010000 04000000
B 908307B84E800020F821FF917C0802A6 90A307B84E800020F821FF917C0802A6
#
AoB Transformation Back To 0 (Synthesis)
0
games24.blog.fc2.com/blog-entry-226.html
B 00010000 04000000
B 4081000863A40000989F0004E80100B0 3BA0000063A40000989F0004E80100B0
#
AoB Transformation Back To 0 (Special Equipment)
0
games24.blog.fc2.com/blog-entry-226.html
B 00010000 04000000
B 408100086085000098A300044E800020 388000006085000098A300044E800020
#
AoB 30,000 Plays, Clears, & Kills
0
games24.blog.fc2.com/blog-entry-226.html
B 00010000 04000000
B 7D04182E78C6002078E7002031230008 390003E778C6002078E7002031230008
B 00010000 04000000
B 80C600047D05192E7D2307B490C70004 38C003E77D05192E7D2307B490C70004
#
```
|
Reisholz is an urban quarter of Düsseldorf, part of Borough 9. It is located in the south of the city, bordering Holthausen, Benrath, Hassels and the river Rhine. It has an area of , and 3,753 inhabitants (2020).
Reisholz is an industrial part of the city. Its history began in 1905 with the creation of a harbour on the Rhine, a goods station and an industrial area by the Industrie-Terrains Düsseldorf-Reisholz (IDR) company. Many chemical factories, engine-building firms, paper mills, petrochemical manufacturers and an oil refinery settled in Reisholz.
In 1907 the IDR company built a neogothic church, which was later demolished to make way for the expanding Henkel company.
Reisholz belonged to Benrath and became a part of Düsseldorf together with Benrath in 1929.
There are plans for a new business and residential area in the harbour of Reisholz.
References
Urban districts and boroughs of Düsseldorf
|
Nothingface is a self-titled demo album by American metal band Nothingface. It is the first album with lead singer Matt Holt.
The album was initially released in 1995 but was reissued as a download on April 30, 2009, with remastering by Drew Mazurek (producer for Hellyeah, Linkin Park, GWAR and others).
Track listing
1995 original
"Defaced" 3:15
"Perfect Person" 4:23
"Severed" 4:56
"Useless" 3:51
"Self Punishment" 4:54
"Hitch" 5:50
"Carousel" 4:05
"Deprive" 3:12
"Godkill" 4:06
"Communion" 5:37
Personnel
Matt Holt – vocals
Tom Maxwell – guitar
Bill Gaal – bass
Chris Houck – drums
References
Nothingface albums
demo albums
|
Pasjak () is a village in Croatia, located on the border with Slovenia. Just north of the village is the northern endpoint of the D8 highway, at the eponymous Pasjak border crossing. The village is part of the Matulji municipality.
References
Populated places in Primorje-Gorski Kotar County
Croatia–Slovenia border crossings
|
The Gunn Wållgren Award (Swedish: Gunn Wållgren-stipendiet) is one of Sweden's theatre awards for young actresses.
It was instituted in the 1980s in memory and honor of the notable Swedish stage and film actress Gunn Wållgren (1913–1983).
Grants from the Gunn Wållgren Memorial Fund are jointly awarded by the Royal Dramatic Theater (Kungliga Dramatiska Teatern), the Royal Swedish Opera (Kungliga Operan) and the Royal Swedish Academy of Music (Kungliga Musikaliska Akademien). Awards are given out annually on the anniversary of Wållgren's birthday (November 16) to a young promising Swedish actress of the stage. The prize sum consists of 20,000 Swedish Kronor.
References
Swedish theatre awards
|
Hurricane Condition (HURCON) is an alert scale used by the United States Armed Forces in the North Atlantic and the North Pacific to indicate the state of emergency or preparedness for an approaching hurricane. This designation is especially important to installations in the southern Atlantic region, as it is most affected by hurricanes. In the western Pacific, where hurricanes are referred to as typhoons, the scale is called Tropical Cyclone Condition of Readiness (TCCOR). A HURCON or TCCOR can be issued up to 96 hours before a hurricane is expected to strike the installation.
HURCON conditions
As of 2021, the scale, which has been updated several times and has minor variations between different military bases, consists of five levels, from HURCON 5 to HURCON 1, with three additional sub-levels for HURCON 1. As with civilian alerts, buildings may be boarded up and personnel evacuated. In addition, aircraft, ships, equipment, and other assets will be relocated, tied down, bunkered, or otherwise secured. The contraction was chosen in line with other military terminology in use, such as DEFCON and FPCON, to communicate hazardous conditions.
TCCOR conditions
As of 2021, TCCOR consists of TCCOR 5 to TCCOR 1, three additional sub-levels for TCCOR 1, TCCOR Storm Watch, and TCCOR All Clear.
Notes
It is possible to return to TCCOR Storm Watch or TCCOR 4 from a higher level of alert if the storm is no longer forecast to reach destructive wind criteria at the installation.
Destructive wind criteria: 50 knots sustained or gust factors of 60 knots or greater.
References
Weather warnings and advisories
Alert measurement systems
Military terminology of the United States
|
Eion was a city of ancient Macedonia.
Eion may also refer to:
Eion (given name)
Eion (Argolis), a town of ancient Argolis
Eion (Pieria), a town of ancient Pieria
Eion (Thrace), a town of ancient Thrace
|
Theoctistus or Theoktistos () is a Greek name derived from θεος theos, "god", and κτίσμα ktisma, "creation, edifice, foundation", the resulting combination being translated to "creation of God", "godly creation".
Theoctistus or Theoktistos can refer to, chronologically:
Theoctistus of Caesarea (2nd-3rd centuries), bishop; see Origen
Theoctistus of Alexandria (3rd century), a sea captain, martyr, saint, and companion of Faustus, Abibus and Dionysius of Alexandria (martyred 250)
Theoctistus of Palestine (died 451 or 467), aka Venerable Theoctistus (Theoktistos) of Palestine, Byzantine monk, hermit and Orthodox saint, active in Palestine, companion of Saint Euthymius the Great with whom he established a monastery, commemorated on September 3
Monk Theoktistos (died 800), the hegumen of Sicilian Kucuma, commemorated as an Eastern Orthodox saint.
Theoktistos (magistros) (fl. 802–821), senior Byzantine official
Theoctistus of Naples, Duke of Naples in 818-821
Theoktistos Bryennios (fl. ca. 842), Byzantine general
Theoktistos, chief minister and regent of the Byzantine Empire from 842 to 855
Theoktistos the Stoudite, 14th-century Byzantine ecclesiastical writer
Teoctist I of Moldavia (ca. 1410-1477), Metropolitan of Moldavia from 1453 to 1477
Teoctist Arăpaşu (born Toader Arăpaşu; 1915-2007), Patriarch of the Romanian Orthodox Church from 1986 to 2007
Teoctist Blajevici (1807-1879), Metropolitan of Bukovina and Dalmatia
|
Galemont is a historic home located at Broad Run, Fauquier County, Virginia. It was built between 1778 and 1817, as a -story, two-room, stone hall-and-parlor-plan residence with a one-room cellar. It was expanded about 1857, and included Federal / Greek Revival-style details. In 1872, a new I-plan house was built less than 20 feet east of the original house, and connected to make one large, multi-period building with transverse center halls. It was further enlarged in 1903, with a connection to the stone kitchen and two-story wing. The later additions added a Folk Victorian style to the house. Also on the property are the contributing garage, silo, old shed, pond, a fieldstone wall (c. 1824), and two archaeological sites: the 1852 Broad Run Train Depot site and an intact segment of the Thoroughfare Gap Road.
It was listed on the National Register of Historic Places in 2012.
References
Houses on the National Register of Historic Places in Virginia
Victorian architecture in Virginia
Federal architecture in Virginia
Greek Revival houses in Virginia
Houses completed in 1817
Houses in Fauquier County, Virginia
National Register of Historic Places in Fauquier County, Virginia
|
```objective-c
/**
* All rights reserved.
*
* This source code is licensed under the BSD-style license found in the
* LICENSE file in the root directory of this source tree. An additional grant
* of patent rights can be found in the PATENTS file in the same directory.
*/
#import "AppDelegate.h"
#import "RCTBundleURLProvider.h"
#import "RCTRootView.h"
@implementation AppDelegate
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
NSURL *jsCodeLocation;
jsCodeLocation = [[RCTBundleURLProvider sharedSettings] jsBundleURLForBundleRoot:@"index.ios" fallbackResource:nil];
RCTRootView *rootView = [[RCTRootView alloc] initWithBundleURL:jsCodeLocation
moduleName:@"ShopReactNative"
initialProperties:nil
launchOptions:launchOptions];
rootView.backgroundColor = [[UIColor alloc] initWithRed:1.0f green:1.0f blue:1.0f alpha:1];
self.window = [[UIWindow alloc] initWithFrame:[UIScreen mainScreen].bounds];
UIViewController *rootViewController = [UIViewController new];
rootViewController.view = rootView;
self.window.rootViewController = rootViewController;
[self.window makeKeyAndVisible];
return YES;
}
@end
```
|
```javascript
(function ($) {
$.extend($.summernote.lang, {
'da-DK': {
font: {
bold: 'Fed',
italic: 'Kursiv',
underline: 'Understreget',
strikethrough: 'Gennemstreget',
clear: 'Fjern formatering',
height: 'Højde',
size: 'Skriftstørrelse'
},
image: {
image: 'Billede',
insert: 'Indsæt billede',
resizeFull: 'Original størrelse',
resizeHalf: 'Halv størrelse',
resizeQuarter: 'Kvart størrelse',
floatLeft: 'Venstrestillet',
floatRight: 'Højrestillet',
floatNone: 'Fjern formatering',
dragImageHere: 'Træk billede hertil',
selectFromFiles: 'Vælg billed-fil',
url: 'Billede URL',
remove: 'Fjern billede'
},
link: {
link: 'Link',
insert: 'Indsæt link',
unlink: 'Fjern link',
edit: 'Rediger',
textToDisplay: 'Visningstekst',
url: 'Hvor skal linket pege hen?',
openInNewWindow: 'Åbn i nyt vindue'
},
video: {
video: 'Video',
videoLink: 'Video Link',
insert: 'Indsæt Video',
url: 'Video URL?',
providers: '(YouTube, Vimeo, Vine, Instagram, or DailyMotion)'
},
table: {
table: 'Tabel'
},
hr: {
insert: 'Indsæt horisontal linje'
},
style: {
style: 'Stil',
normal: 'Normal',
blockquote: 'Citat',
pre: 'Kode',
h1: 'Overskrift 1',
h2: 'Overskrift 2',
h3: 'Overskrift 3',
h4: 'Overskrift 4',
h5: 'Overskrift 5',
h6: 'Overskrift 6'
},
lists: {
unordered: 'Punktopstillet liste',
ordered: 'Nummereret liste'
},
options: {
help: 'Hjælp',
fullscreen: 'Fuld skærm',
codeview: 'HTML-Visning'
},
paragraph: {
paragraph: 'Afsnit',
outdent: 'Formindsk indryk',
indent: 'Forøg indryk',
left: 'Venstrestillet',
center: 'Centreret',
right: 'Højrestillet',
justify: 'Blokjuster'
},
color: {
recent: 'Nyligt valgt farve',
more: 'Flere farver',
background: 'Baggrund',
foreground: 'Forgrund',
transparent: 'Transparent',
setTransparent: 'Sæt transparent',
reset: 'Nulstil',
resetToDefault: 'Gendan standardindstillinger'
},
shortcut: {
shortcuts: 'Genveje',
close: 'Luk',
textFormatting: 'Tekstformatering',
action: 'Handling',
paragraphFormatting: 'Afsnitsformatering',
documentStyle: 'Dokumentstil'
},
history: {
undo: 'Fortryd',
redo: 'Annuller fortryd'
}
}
});
})(jQuery);
```
|
A thyrsus is a staff of giant fennel covered with ivy vines and leaves.
Thyrsus may also refer to:
Thyrsus (grasshopper), a genus of grasshopper in the family Tetrigidae
Thyrsus (giant), a mythical figure from Austria
Saint Thyrsus (died 251), Christian martyr
Thyrsus (Mage: the Awakening), a Mage character
Thyrsus González de Santalla (1624-1705), Spanish theologian
See also
Thyrsis (disambiguation)
Thyrse, a flowering plant structure called thyrsus in botanical Latin
Masculine given names
pl:Tyrs
|
```scss
@media (max-width: 768px) {
.navbar-toggle {
position: absolute;
z-index: 9999;
left: 0;
top: 0;
}
.navbar a.navbar-brand {
display: block;
margin: 0 auto 0 auto;
width: 148px;
height: 50px;
float: none;
background: url("../images/spring-logo-dataflow-mobile.png") 0 center no-repeat;
}
.homepage-billboard .homepage-subtitle {
font-size: 21px;
line-height: 21px;
}
.navbar a.navbar-brand span {
display: none;
}
.navbar {
border-top-width: 0;
}
.xd-container {
margin-top: 20px;
margin-bottom: 30px;
}
.index-page--subtitle {
margin-top: 10px;
margin-bottom: 30px;
}
}
```
|
```objective-c
/*
* MTCore.mm
*
*/
#include "MTCore.h"
#include <string>
#include <stdexcept>
namespace LLGL
{
void MTThrowIfFailed(NSError* error, const char* info)
{
if (error != nullptr)
{
std::string s = info;
s += ": ";
NSString* errorMsg = [error localizedDescription];
s += [errorMsg cStringUsingEncoding:NSUTF8StringEncoding];
throw std::runtime_error(s);
}
}
void MTThrowIfCreateFailed(NSError* error, const char* interfaceName, const char* contextInfo)
{
if (error != nullptr)
{
std::string s;
{
s = "failed to create instance of <";
s += interfaceName;
s += '>';
if (contextInfo != nullptr)
{
s += ' ';
s += contextInfo;
}
}
MTThrowIfFailed(error, s.c_str());
}
}
BOOL MTBoolean(bool value)
{
return (value ? YES : NO);
}
} // /namespace LLGL
// ================================================================================
```
|
Rhenium dioxide trifluoride is an inorganic compound with the formula . A white diamagnetic solid, it is one of the few oxyfluorides of rhenium, another being rhenium trioxide fluoride, . The material is of some academic interest as a rare example of a dioxide trifluoride. It can be prepared by the reaction of xenon difluoride and rhenium trioxide chloride:
According to X-ray crystallography, the compound can exist in four polymorphs. Two polymorphs adopt chain-like structures featuring octahedral Re centers linked by bridging fluorides. Two other polymorphs adopt cyclic structures, again featuring octahedral Re centers and bridging fluorides. Like related oxyfluorides, these coordination oligomers break up in the presence of Lewis bases. Adducts of the formula where L = acetonitrile have been crystallized.
References
Rhenium compounds
Fluorides
Transition metal oxides
|
```kotlin
package net.corda.serialization.internal.verifier
import net.corda.core.crypto.SecureHash
import net.corda.core.internal.concurrent.openFuture
import net.corda.serialization.internal.verifier.ExternalVerifierOutbound.VerifierRequest.GetAttachments
import net.corda.testing.core.SerializationEnvironmentRule
import org.assertj.core.api.Assertions.assertThat
import org.junit.Rule
import org.junit.Test
import java.net.InetSocketAddress
import java.nio.channels.ServerSocketChannel
import java.nio.channels.SocketChannel
import kotlin.concurrent.thread
class ExternalVerifierTypesTest {
@get:Rule
val testSerialization = SerializationEnvironmentRule()
@Test(timeout=300_000)
fun `socket channel read-write`() {
val payload = GetAttachments(setOf(SecureHash.randomSHA256(), SecureHash.randomSHA256()))
val serverChannel = ServerSocketChannel.open()
serverChannel.bind(null)
val future = openFuture<GetAttachments>()
thread {
SocketChannel.open().use {
it.connect(InetSocketAddress(serverChannel.socket().localPort))
val received = it.readCordaSerializable(GetAttachments::class)
future.set(received)
}
}
serverChannel.use { it.accept().writeCordaSerializable(payload) }
assertThat(future.get()).isEqualTo(payload)
}
}
```
|
The Quartzites et Poudingues de Trémentines is a geologic formation in France. It preserves fossils dating back to the Cambrian period.
See also
List of fossiliferous stratigraphic units in France
References
Geologic formations of France
Cambrian System of Europe
Cambrian France
Quartzite formations
Formations
|
Walt Disney's Treasury of Classic Tales is an American Disney comic strip, which ran on Sundays in newspapers from July 13, 1952, until February 15, 1987. It was distributed by King Features Syndicate. Each story adapted a different Disney film, such as Darby O'Gill and the Little People, Peter Pan, or Davy Crockett. It was run in relatively few papers, with 58 in 1957 and 55 in 1966, and was principally a vehicle for promoting new and re-released Disney films.
Publication history
From March 8 to June 18, 1950, Disney distributed a limited-time Sunday strip adaptations of their new animated feature Cinderella, written by Frank Reilly, with art by Manuel Gonzales and Dick Moores. The same team followed the next year with Alice in Wonderland, which ran from September 2 to December 16, 1951. Judged a success, the experiment was turned into an ongoing feature in 1952, beginning with The Story of Robin Hood.
The strip featured a wide variety of Disney stories. The animated features adapted for the strip include Peter Pan (1953), Lady and the Tramp (1955), Sleeping Beauty (1958), The Sword in the Stone (1963) and The Jungle Book (1968). Classic Tales also featured animated shorts, including Lambert the Sheepish Lion (1956) and Ben and Me (1953), and featurettes like Peter & The Wolf (1954) and Winnie the Pooh and the Honey Tree (1966).
Treasury of Classic Tales also adapted live-action films like Old Yeller (1957–58), Swiss Family Robinson (1960), Mary Poppins (1964) and The Love Bug (1969). The strip transitioned from historical dramas like The Sword and the Rose (1953) and Kidnapped (1960) to comedies like The Shaggy Dog (1959) and The Parent Trap (1961).
The 1979–80 adaptation of The Black Hole was particularly notable for featuring pencil art by comics icon Jack Kirby, with Mike Royer inking.
Some of the stories created toward the end of the strip's run in the 1980s were original stories featuring characters from different Disney animated movies, including The Return of the Rescuers (1983), Dumbo, the Substitute Stork (1984) and Cinderella: Bibbidi-Bobbodi-Who? (1984).
Most stories ran for thirteen weeks. A total of 129 stories were created between 1952 and 1987.
List of story titles
{| class="wikitable" style="margin:auto;"
|-
! Title
!Year
! Dates
! Writing
! Art
! INDUCKS link
|-
| The Story of Robin Hood
|1952
| July 13 – Dec 28
| rowspan="84" | Frank Reilly
| Jesse Marsh
| ToCT 1
|-
| Peter Pan
| rowspan="3" |1953
| Jan 4 – June 14
| Manuel Gonzales& Dick Moores
| ToCT 2
|-
| The Sword and the Rose
| June 21 – Oct 25
| Jesse Marsh
| ToCT 3
|-
| Ben and Me
| Nov 1 – Dec 27
| Manuel Gonzales& Dick Moores
| ToCT 4
|-
| Rob Roy
| rowspan="3" |1954
| Jan 3 – May 30
| Jesse Marsh
| ToCT 5
|-
| Peter & The Wolf
| June 6 – July 25
| Manuel Gonzales& Dick Moores
| ToCT 6
|-
| 20,000 Leagues Under the Sea
| Aug 1 – Dec 26
| Jesse Marsh
| ToCT 7
|-
| Lady and the Tramp
|1955
| Jan 2 – July 10
| Manuel Gonzales& Dick Moores
| ToCT 8
|-
| The Legends of Davy Crockett
| rowspan="4" |1956
| July 7, 1955 – Jan 8
| rowspan="3" | Jesse Marsh
| ToCT 9
|-
| The Littlest Outlaw
| Jan 15 – March 26
| ToCT 10
|-
| The Great Locomotive Chase
| April 2 – July 29
| ToCT 11
|-
| Lambert the Sheepish Lion
| Aug 5 – Sep 30
| Floyd Gottfredson
| ToCT 12
|-
| Westward Ho, the Wagons!
|1956–1957
| Oct 7 – Jan 27
| Jesse Marsh
| ToCT 13
|-
| Gus & Jaq| rowspan="3" |1957
| Feb 3 – March 31
| Ken Hultgren
| ToCT 14
|-
| Johnny Tremain| Apr 7 – June 30
| rowspan="3" | Jesse Marsh
| ToCT 15
|-
| Perri| July 7 – Nov 24
| ToCT 16
|-
| Old Yeller|1957–1958
| Dec 1 – Feb 23
| ToCT 17
|-
| The Seven Dwarfs & The Witch Queen| rowspan="3" |1958
| March 2 – Apr 27
| Julius Svedsen
| ToCT 18
|-
| The Light in the Forest| May 4 – July 27
| Jesse Marsh
| ToCT 19
|-
| Sleeping Beauty| Aug 3 – Dec 28
| Julius Svedsen
| ToCT 20
|-
| The Shaggy Dog| rowspan="3" |1959
| Jan 4 – Apr 26
| rowspan="7" | Jesse Marsh
| ToCT 21
|-
| Darby O'Gill and the Little People| May 3 – Aug 3
| ToCT 22
|-
| Third Man on the Mountain| Sept 6 – Dec 27
| ToCT 23
|-
| Toby Tyler| rowspan="4" |1960
| Jan 3 – Mar 27
| ToCT 24
|-
| Kidnapped| Apr 3 – June 26
| ToCT 25
|-
| Pollyanna| July 3 – Sep 25
| ToCT 26
|-
| Swiss Family Robinson| Oct 2 – Dec 25
| ToCT 27
|-
| 101 Dalmatians| rowspan="4" |1961
| Jan 1 – Mar 26
| Bill Wright & Chuck Fuson,
Manuel Gonzales & Floyd Gottfredson
| ToCT 28
|-
| Nikki, Wild Dog of the North| Apr 2 – June 25
| rowspan="2" | Jesse Marsh
| ToCT 29
|-
| The Parent Trap| July 2 – Sep 25
| ToCT 30
|-
| Babes in Toyland| Oct 1 – Dec 31
| Joseph Hale
| ToCT 31
|-
| Moon Pilot| rowspan="4" |1962
| Jan 7 – Mar 25
| Jesse Marsh
| ToCT 32
|-
| Bon Voyage!| Apr 1 – June 24
| John Ushler
| ToCT 33
|-
| Big Red
| July 1 – Sep 30
| Jesse Marsh
| ToCT 34
|-
| In Search of the Castaways
| Oct 7 – Dec 30
| rowspan="42" | John Ushler
| ToCT 35
|-
| Son of Flubber| rowspan="4" |1963
| Jan 6 – Mar 31
| ToCT 36
|-
| Miracle of the White Stallions| Apr 7 – June 30
| ToCT 37
|-
| Savage Sam| July 7 – Sep 29
| ToCT 38
|-
| The Sword in the Stone| Oct 6 – Dec 29
| ToCT 39
|-
| A Tiger Walks| rowspan="4" |1964
| Jan 5 – March 29
| ToCT 40
|-
| The Three Lives of Thomasina| Apr 6 – June 28
| ToCT 41
|-
| The Moon-Spinners
| July 5 – Sep 27
| ToCT 42
|-
| Mary Poppins| Oct 4 – Dec 27
| ToCT 43
|-
| Those Calloways| rowspan="4" |1965
| Jan 3 – Mar 28
| ToCT 44
|-
| The Monkey's Uncle| Apr 4 – June 27
| ToCT 45
|-
| Dumbo| July 4 – Sep 26
| ToCT 46
|-
| That Darn Cat!| Oct 3 – Dec 26
| ToCT 47
|-
| Winnie the Pooh and the Honey Tree| rowspan="4" |1966
| Jan 2 – Mar 27
| ToCT 48
|-
| Lt. Robin Crusoe, U.S.N.
| Apr 3 – June 26
| ToCT 49
|-
| The Fighting Prince of Donegal| July 3 – Sep 25
| ToCT 50
|-
| Follow Me, Boys!| Oct 2 – Nov 27
| ToCT 51
|-
| Monkeys, Go Home!|1966–1967
| Dec 4 – Jan 29
| ToCT 52
|-
| The Adventures of Bullwhip Griffin| rowspan="3" |1967
| Feb 5 – Apr 30
| ToCT 53
|-
| The Gnome-Mobile| May 7 – July 30
| ToCT 54
|-
| The Happiest Millionaire
| Aug 6 – Oct 29
| ToCT 55
|-
| The Jungle Book|1967–1968
| Nov 5 – Jan 28
| ToCT 56
|-
| Blackbeard's Ghost| rowspan="4" |1968
| Feb 4 – Apr 28
| ToCT 57
|-
| Never A Dull Moment| May 6 – July 28
| ToCT 58
|-
| Winnie the Pooh and the Blustery Day| Aug 4 – Sep 29
| ToCT 59
|-
| The Horse in the Gray Flannel Suit| Oct 6 – Dec 29
| ToCT 60
|-
| Smith!| rowspan="4" |1969
| Jan 5 – Feb 23
| ToCT 61
|-
| The Love Bug| Mar 2 – May 25
| ToCT 62
|-
| Hang Your Hat on the Wind!| June 1 – Aug 31
| ToCT 63
|-
| My Dog, The Thief| Sept 7 – Nov 30
| ToCT 64
|-
| The Computer Wore Tennis Shoes|1969–1970
| Dec 7 – Feb 22
| ToCT 65
|-
| King of the Grizzlies| rowspan="3" |1970
| Mar 1 – May 31
| ToCT 66
|-
| The Boatniks| June 7 – Aug 30
| ToCT 67
|-
| The Aristocats| Sept 6 – Dec 27
| ToCT 68
|-
| The Barefoot Executive| rowspan="4" |1971
| Jan 3 – Mar 28
| ToCT 69
|-
| The Million Dollar Duck| Apr 4 – June 27
| ToCT 70
|-
| Bedknobs and Broomsticks| July 4 – Oct 31
| ToCT 71
|-
| The Living Desert| Nov 7 – Dec 26
| ToCT 72
|-
| Napoleon and Samantha| rowspan="4" |1972
| Jan 2 – Mar 26
| ToCT 73
|-
| Now You See Him, Now You Don't| Apr 2 – June 25
| ToCT 74
|-
| The Legend of Lobo| July 2 – Sep 24
| ToCT 75
|-
| Snowball Express
| Oct 1 – Dec 31
| ToCT 76
|-
| The World's Greatest Athlete| rowspan="3" |1973
| Jan 7 – Mar 26
| rowspan="15" | Mike Arens
| ToCT 77
|-
| Cinderella| Apr 1 – June 24
| ToCT 78
|-
| One Little Indian| July 1 – Sep 30
| ToCT 79
|-
| Robin Hood|1973–1974
| Oct 7 – Jan 27
| ToCT 80
|-
| Alice in Wonderland| rowspan="3" |1974
| Feb 3 – Apr 28
| ToCT 81
|-
| Herbie Rides Again| May 5 – July 28
| ToCT 82
|-
| The Bears and I| Aug 4 – Sep 29
| ToCT 83
|-
| The Island at the Top of the World|1974–1975
| October 6, 1974 – January 26, 1975
| ToCT 84
|-
| Escape to Witch Mountain
| rowspan="4" |1975
| Feb 3 – Apr 27
| Carl Fallberg
| ToCT 85
|-
| The Apple Dumpling Gang| May 4 – June 29
| rowspan="3" | Frank Reilly
| ToCT 86
|-
| One of Our Dinosaurs Is Missing| July 6 – Sep 28
| ToCT 87
|-
| Winnie the Pooh and Tigger Too
| Oct 5 – Nov 30
| ToCT 88
|-
| No Deposit, No Return|1975–1976
| December 7, 1975 – February 29, 1976
| Frank Reilly & Carl Fallberg
| ToCT 89
|-
| Gus| rowspan="3" |1976
| Mar 7 – May 30
| rowspan="3" | Carl Fallberg
| ToCT 90
|-
| Treasure of Matecumbe| June 6 – Aug 29
| ToCT 91
|-
| The Shaggy D.A.| Sept 5 – Nov 28
| rowspan="12" | Richard Moore
| ToCT 92
|-
| Freaky Friday|1976–1977
| Dec 5 – Feb 27
| Al Stoffel
| ToCT 93
|-
| The Rescuers| rowspan="3" |1977
| Mar 6 – May 29
| Carl Fallberg
| ToCT 94
|-
| Herbie Goes to Monte Carlo| June 5 – Aug 28
| Al Stoffel
| ToCT 95
|-
| Pete's Dragon| Sept 4 – Nov 27
| Carl Fallberg
| ToCT 96
|-
| Candleshoe|1977–1978
| December 4, 1977 – Feb 26
| Carl Fallberg & Al Stoffel
| ToCT 97
|-
| The Cat from Outer Space| rowspan="3" |1978
| Mar 5 – May 28
| Carl Fallberg
| ToCT 98
|-
| Hot Lead and Cold Feet| June 4 – Aug 27
| Al Stoffel
| ToCT 99
|-
| Pinocchio| Sept 3 – Nov 26
| rowspan="2" | Carl Fallberg
| ToCT 100
|-
| The North Avenue Irregulars|1978–1979
| Dec 3 – Feb 25
| ToCT 101
|-
| The Apple Dumpling Gang Rides Again| rowspan="2" |1979
| Mar 4 – May 27
| rowspan="2" | Al Stoffel
| ToCT 102
|-
| Unidentified Flying Oddball| June 3 – Aug 26
| ToCT 103
|-
| The Black Hole|1979–1980
| Sept 2 – Feb 24
| rowspan="2" | Carl Fallberg
| Jack Kirby & Mike Royer
| ToCT 104
|-
| The Watcher in the Woods| rowspan="3" |1980
| Mar 2 – May 25
| rowspan="3" | Richard Moore
| ToCT 105
|-
| The Last Flight of Noah's Ark| June 1 – Aug 24
| rowspan="2" | Al Stoffel
| ToCT 106
|-
| The Devil and Max Devlin| Aug 31 – Nov 23
| ToCT 107
|-
| Condorman|1980–1981
| Nov 30 – Apr 12
| Greg Crosby
| Russ Heath
| ToCT 108
|-
| The Fox and the Hound|1981
| Apr 19 – Aug 30
| rowspan="4" | Jeannette Steiner
| rowspan="4" | Richard Moore
| ToCT 109
|-
| Night Crossing|1981–1982
| Sept 6 – Jan 17
| ToCT 110
|-
| Tron|1982
| Jan 24 – June 6
| ToCT 111
|-
| Tex| rowspan="2" |1982
| June 13 – Sep 26
| ToCT 112
|-
| Mickey's Christmas Carol| Oct 3 – Dec 26
| rowspan="3" | Carl Fallberg
| Richard Moore & Frank Johnson
| ToCT 113
|-
| Ferdinand the Bull & The Robbers| rowspan="4" |1983
| Jan 2 – Mar 6
| rowspan="16" | Richard Moore
| ToCT 114
|-
| Snow White and the Seven Dwarfs| Mar 13 – June 26
| ToCT 115
|-
| The Adventures of Mr. Toad| July 3 – Sep 25
| rowspan="5" | Tom Yakutis
| ToCT 116
|-
| The Return of the Rescuers| Oct 2 – Dec 25
| ToCT 117
|-
| Dumbo, the Substitute Stork| rowspan="4" |1984
| Jan 1 – Mar 25
| ToCT 118
|-
| Robin Hood in: Rich John, Poor John| Apr 1 – June 24
| ToCT 119
|-
| Cinderella: Bibbidi-Bobbodi-Who?| July 1 – Sep 23
| ToCT 120
|-
| Pinocchio & Jiminy Cricket: A Coat Tale| Sept 30 – Dec 30
| Carl Fallberg
| ToCT 121
|-
| Black Arrow| rowspan="3" |1985
| Jan 6 – Mar 31
| Tom Yakutis
| ToCT 122
|-
| Return to Oz| Apr 7 – July 14
| Carl Fallberg
| ToCT 123
|-
| The Black Cauldron| July 21 – Oct 27
| Tom Yakutis
| ToCT 124
|-
| The Journey of Natty Gann|1985–1986
| Nov 3 – Jan 26
| Don Dougherty
| ToCT 125
|-
| The Search For Sleeping Beauty| rowspan="3" |1986
| Feb 2 – Apr 27
| rowspan="4" | Carl Fallberg
| ToCT 126
|-
| The Great Mouse Detective| May 4 – July 27
| ToCT 127
|-
| Song of the South| Aug 3 – Nov 16
| ToCT 128
|-
| Tramp's Cat-astrophe|1986–1987
| Nov 23 – Feb 15
| ToCT 129
|-
|}
Reprintings
In 2016, IDW Publishing and their imprint The Library of American Comics (LoAC) began to collect all the Treasury of Classic Tales stories in a definitive hardcover reprint series. As of 2019, three volumes have been published, reprinting all the stories from Robin Hood (1952) through In Search of the Castaways (1962). In April 2018, it was announced that, due to the sales goal of the series not being met, the third volume may be the last one to be published.
Notes and references
External links
Walt Disney's Treasury of Classic Tales (samples of 1970s strips)
INDUCKS database of credits for Treasury of Classic Tales
American comic strips
1952 comics debuts
1987 comics endings
Disney comic strips
Comics based on novels
Comics based on films
Comics based on fairy tales
|
A Strangely Isolated Place is the second studio album by German electronic musician Ulrich Schnauss, released on 9 June 2003 by City Centre Offices. It was released in the United States on 5 October 2004 by Domino Recording Company.
On 13 October 2008, a remastered edition of A Strangely Isolated Place was issued by Independiente. The album was remastered again in 2019 for a new reissue, which was released on 17 April 2020 by Scripted Realities.
Critical reception
Resident Advisor named A Strangely Isolated Place the 37th best album of the 2000s, describing it as "ambient music with enough oomph to keep the club kids happy." In 2016, Pitchfork ranked the record at number 31 on its list of the 50 best shoegaze albums of all time.
Track listing
Personnel
Credits are adapted from the album's liner notes.
Ulrich Schnauss – production
Aesthetic Investments – cover design
Judith Beck – vocals
Paul Davis – guitar
Markus Knothe – photography
Loop-O – mastering
References
External links
2003 albums
Domino Recording Company albums
Ulrich Schnauss albums
|
Outsports is a sports news website concerned with LGBT issues and personalities in amateur and professional sports. The company was founded in 1999 by Cyd Zeigler, Jr. and Jim Buzinski.
The Outsports Revolution (Alyson Publications), by Zeigler and Buzinski, was released in 2007. The book chronicles the development of the Outsports.com brand and its impact on the world of gay sports, covers the gay sports movement, introduces both famous and non-famous LGBT athletes, and examines various myths and controversies regarding gays and sports.
The site received the 2003 National Lesbian and Gay Journalists Association's Excellence in New Media Journalism Award.
Outsports was purchased by Vox Media in 2013. Buzinski and Zeigler retained editorial control and continue to operate the site as part of its sports blog network SB Nation. In spring 2023, Zeigler was criticized for his public transition to the Republican Party and for his endorsement of Governor Ron DeSantis for the 2024 United States presidential election.
See also
Athlete Ally
References
External links
Outsports.com
LGBT-related websites
LGBT sports organizations in the United States
SB Nation
|
```python
__author__ = "saeedamen" # Saeed Amen
import abc
import copy
from findatapy.market.marketdatarequest import MarketDataRequest
from findatapy.util import ConfigManager, LoggerManager
class DataVendor(object):
"""Abstract class for various data source loaders.
"""
def __init__(self):
self.config = ConfigManager().get_instance()
# self.config = None
return
@abc.abstractmethod
def load_ticker(self, md_request):
"""Retrieves market data from external data source
Parameters
----------
md_request : MarketDataRequest
contains all the various parameters detailing time series start and
finish, tickers etc
Returns
-------
DataFrame
"""
return
# to be implemented by subclasses
@abc.abstractmethod
def kill_session(self):
return
def construct_vendor_md_request(self, md_request,
fill_vendors_tickers_only=False):
"""Creates a MarketDataRequest with the vendor tickers
Parameters
----------
md_request : MarketDataRequest
contains all the various parameters detailing time series start and
finish, tickers etc
fill_vendors_tickers_only : bool
Only search for vendors tickers (and ignore fields etc.)
Returns
-------
MarketDataRequest
"""
md_request_vendor = MarketDataRequest(
md_request=md_request)
md_request_vendor.tickers = self.translate_to_vendor_ticker(md_request)
if not fill_vendors_tickers_only:
md_request_vendor.fields = \
self.translate_to_vendor_field(md_request)
md_request_vendor.old_tickers = \
md_request.tickers
return md_request_vendor
def translate_to_vendor_field(self, md_request):
"""Converts all the fields from findatapy fields to vendor fields
Parameters
----------
md_request : MarketDataRequest
contains all the various parameters detailing time series start
and finish, tickers etc
Returns
-------
List of Strings
"""
if md_request.vendor_fields is not None:
return md_request.vendor_fields
source = md_request.data_source
fields_list = md_request.fields
if isinstance(fields_list, str):
fields_list = [fields_list]
if self.config is None: return fields_list
fields_converted = []
for field in fields_list:
try:
f = self.config.convert_library_to_vendor_field(source, field)
except:
logger = LoggerManager().getLogger(__name__)
logger.warn(
"Couldn't find field conversion, "
"did you type it correctly: " + field)
return
fields_converted.append(f)
return fields_converted
# Translate findatapy ticker to vendor ticker
def translate_to_vendor_ticker(self, md_request):
"""Converts all the tickers from findatapy tickers to vendor tickers
Parameters
----------
md_request : MarketDataRequest
contains all the various parameters detailing time series start
and finish, tickers etc
Returns
-------
List of Strings
"""
if md_request.vendor_tickers is not None:
return md_request.vendor_tickers
category = md_request.category
source = md_request.data_source
freq = md_request.freq
cut = md_request.cut
tickers_list = md_request.tickers
if isinstance(tickers_list, str):
tickers_list = [tickers_list]
if self.config is None: return tickers_list
tickers_list_converted = []
for ticker in tickers_list:
try:
t = self.config.convert_library_to_vendor_ticker(category,
source, freq,
cut, ticker)
except:
logger = LoggerManager().getLogger(__name__)
logger.error(
"Couldn't find ticker conversion, did you type "
"it correctly: " + ticker)
return
tickers_list_converted.append(t)
return tickers_list_converted
def translate_from_vendor_field(self, vendor_fields_list,
md_request):
"""Converts all the fields from vendors fields to findatapy fields
Parameters
----------
md_request : MarketDataRequest
contains all the various parameters detailing time series start
and finish, tickers etc
Returns
-------
List of Strings
"""
data_source = md_request.data_source
if isinstance(vendor_fields_list, str):
vendor_fields_list = [vendor_fields_list]
# if self.config is None: return vendor_fields_list
fields_converted = []
# If we haven't set the configuration files for automatic configuration
if md_request.vendor_fields is not None:
dictionary = dict(zip(self.get_lower_case_list(
md_request.vendor_fields),
md_request.fields))
for vendor_field in vendor_fields_list:
try:
fields_converted.append(dictionary[vendor_field.lower()])
except:
fields_converted.append(vendor_field)
# Otherwise used stored configuration files (every field needs to be
# defined!)
else:
for vendor_field in vendor_fields_list:
try:
v = self.config.convert_vendor_to_library_field(
data_source, vendor_field)
except:
logger = LoggerManager().getLogger(__name__)
logger.error(
"Couldn't find field conversion, did you type it "
"correctly: " + vendor_field +
", using 'close' as default.")
v = 'close'
fields_converted.append(v)
return fields_converted
# Translate findatapy ticker to vendor ticker
def translate_from_vendor_ticker(self, vendor_tickers_list, md_request):
"""Converts all the fields from vendor tickers to findatapy tickers
Parameters
----------
md_request : MarketDataRequest
contains all the various parameters detailing time series start
and finish, tickers etc
Returns
-------
List of Strings
"""
if md_request.vendor_tickers is not None:
dictionary = dict(
zip(self.get_lower_case_list(md_request.vendor_tickers),
md_request.tickers))
tickers_stuff = []
for vendor_ticker in vendor_tickers_list:
tickers_stuff.append(dictionary[vendor_ticker.lower()])
return tickers_stuff
# tickers_list = md_request.tickers
if isinstance(vendor_tickers_list, str):
vendor_tickers_list = [vendor_tickers_list]
if self.config is None: return vendor_tickers_list
tickers_converted = []
for vendor_ticker in vendor_tickers_list:
try:
v = self.config.convert_vendor_to_library_ticker(
md_request.category, md_request.data_source,
md_request.freq, md_request.cut, vendor_ticker)
except:
logger = LoggerManager().getLogger(__name__)
logger.error("Couldn't find ticker conversion, "
"did you type it correctly: " + vendor_ticker)
return
tickers_converted.append(v)
return tickers_converted
def get_lower_case_list(self, lst):
return [k.lower() for k in lst]
```
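The `translate_from_vendor_*` methods above lower-case user-supplied vendor names before the dictionary lookup so matching is case-insensitive, and fall back to the vendor name when no mapping exists. A minimal standalone sketch of that pattern (the function name and the ticker strings are illustrative, not part of findatapy):

```python
def translate_tickers(vendor_tickers, vendor_to_library):
    """Map vendor tickers to library tickers, case-insensitively.

    Unknown tickers fall back to their vendor name, mirroring the
    fallback used for fields in the code above.
    """
    lookup = {k.lower(): v for k, v in vendor_to_library.items()}
    return [lookup.get(t.lower(), t) for t in vendor_tickers]

mapping = {"EURUSD Curncy": "EURUSD", "GBPUSD Curncy": "GBPUSD"}
print(translate_tickers(["eurusd curncy", "XAUUSD Curncy"], mapping))
# ['EURUSD', 'XAUUSD Curncy']
```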
|
```go
// +build !linux
package plugin // import "github.com/docker/docker/plugin"
import (
"context"
"errors"
"io"
"net/http"
"github.com/docker/distribution/reference"
"github.com/docker/docker/api/types"
"github.com/docker/docker/api/types/filters"
)
var errNotSupported = errors.New("plugins are not supported on this platform")
// Disable deactivates a plugin, which implies that they cannot be used by containers.
func (pm *Manager) Disable(name string, config *types.PluginDisableConfig) error {
return errNotSupported
}
// Enable activates a plugin, which implies that they are ready to be used by containers.
func (pm *Manager) Enable(name string, config *types.PluginEnableConfig) error {
return errNotSupported
}
// Inspect examines a plugin config
func (pm *Manager) Inspect(refOrID string) (tp *types.Plugin, err error) {
return nil, errNotSupported
}
// Privileges pulls a plugin config and computes the privileges required to install it.
func (pm *Manager) Privileges(ctx context.Context, ref reference.Named, metaHeader http.Header, authConfig *types.AuthConfig) (types.PluginPrivileges, error) {
return nil, errNotSupported
}
// Pull pulls a plugin, check if the correct privileges are provided and install the plugin.
func (pm *Manager) Pull(ctx context.Context, ref reference.Named, name string, metaHeader http.Header, authConfig *types.AuthConfig, privileges types.PluginPrivileges, out io.Writer, opts ...CreateOpt) error {
return errNotSupported
}
// Upgrade pulls a plugin, check if the correct privileges are provided and install the plugin.
func (pm *Manager) Upgrade(ctx context.Context, ref reference.Named, name string, metaHeader http.Header, authConfig *types.AuthConfig, privileges types.PluginPrivileges, outStream io.Writer) error {
return errNotSupported
}
// List displays the list of plugins and associated metadata.
func (pm *Manager) List(pluginFilters filters.Args) ([]types.Plugin, error) {
return nil, errNotSupported
}
// Push pushes a plugin to the store.
func (pm *Manager) Push(ctx context.Context, name string, metaHeader http.Header, authConfig *types.AuthConfig, out io.Writer) error {
return errNotSupported
}
// Remove deletes plugin's root directory.
func (pm *Manager) Remove(name string, config *types.PluginRmConfig) error {
return errNotSupported
}
// Set sets plugin args
func (pm *Manager) Set(name string, args []string) error {
return errNotSupported
}
// CreateFromContext creates a plugin from the given pluginDir which contains
// both the rootfs and the config.json and a repoName with optional tag.
func (pm *Manager) CreateFromContext(ctx context.Context, tarCtx io.ReadCloser, options *types.PluginCreateOptions) error {
return errNotSupported
}
```
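The file above stubs every plugin operation with one shared sentinel error, so callers get a uniform failure on platforms without plugin support. A rough Python sketch of the same pattern (hypothetical `Manager` and error names, not part of Docker's API):

```python
class PluginError(Exception):
    """Raised for plugin operations on platforms without plugin support."""

# One shared sentinel message, mirroring Go's package-level errNotSupported.
NOT_SUPPORTED_MSG = "plugins are not supported on this platform"

class Manager:
    # Every operation fails the same way on unsupported platforms.
    def disable(self, name):
        raise PluginError(NOT_SUPPORTED_MSG)

    def enable(self, name):
        raise PluginError(NOT_SUPPORTED_MSG)

try:
    Manager().disable("sample-plugin")
except PluginError as err:
    print(err)  # plugins are not supported on this platform
```

A single sentinel keeps the check in calling code trivial: compare against one value instead of parsing per-method error strings.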
|
```csharp
using EColor = ElmSharp.Color;
namespace Xamarin.Forms.Platform.Tizen
{
public static class ColorExtensions
{
/// <summary>
/// Creates an instance of ElmSharp.Color class based on provided Xamarin.Forms.Color instance
/// </summary>
/// <returns>ElmSharp.Color instance representing a color which corresponds to the provided Xamarin.Forms.Color</returns>
/// <param name="c">The Xamarin.Forms.Color instance which will be converted to a ElmSharp.Color</param>
public static EColor ToNative(this Color c)
{
if (c.IsDefault)
{
// Converting the default color; this may result in a black color.
return EColor.Default;
}
else
{
return new EColor((int)(255.0 * c.R), (int)(255.0 * c.G), (int)(255.0 * c.B), (int)(255.0 * c.A));
}
}
public static Color WithAlpha(this Color color, double alpha)
{
return new Color(color.R, color.G, color.B, (int)(255 * alpha));
}
public static Color WithPremultiplied(this Color color, double alpha)
{
return new Color((int)(color.R * alpha), (int)(color.G * alpha), (int)(color.B * alpha), color.A);
}
/// <summary>
/// Returns a string representing the provided ElmSharp.Color instance in hexadecimal notation
/// </summary>
/// <returns>string value containing the encoded color</returns>
/// <param name="c">The ElmSharp.Color class instance which will be serialized</param>
internal static string ToHex(this EColor c)
{
if (c.IsDefault)
{
Log.Warn("Trying to convert the default color to hexagonal notation, it does not works as expected.");
}
return string.Format("#{0:X2}{1:X2}{2:X2}{3:X2}", c.R, c.G, c.B, c.A);
}
}
}
```
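`ToNative` scales each normalized channel in [0, 1] to a 0–255 byte, and `ToHex` formats bytes as `#RRGGBBAA`. The same arithmetic in a short Python sketch (hypothetical helper names, independent of the Xamarin/ElmSharp APIs); note that the `int(...)` truncation matches the C# cast rather than rounding:

```python
def to_native(r, g, b, a):
    # Scale normalized [0.0, 1.0] channels to 0-255 ints, as ToNative does.
    return tuple(int(255.0 * c) for c in (r, g, b, a))

def to_hex(r, g, b, a):
    # Format byte channels as #RRGGBBAA, as ToHex does.
    return "#{:02X}{:02X}{:02X}{:02X}".format(r, g, b, a)

print(to_hex(*to_native(1.0, 0.5, 0.0, 1.0)))  # #FF7F00FF
```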
|
Carolína Acuña Díaz González (1894–1996) was a businesswoman and activist in the Denver Latinx community. In 1946, she opened "Casa Mayan," a prominent restaurant and community gathering space.
Biography
González was born in El Paso, Texas on February 18, 1894.
Early life
Ramon Gonzalez's family moved from Chihuahua, Mexico, to El Paso, Texas, during the Mexican Revolution in the 1910s. Ramon and Carolína migrated to southern Colorado and then to Denver in 1918, settling in the Auraria neighborhood.
Denver and the Casa Mayan
González created a safe haven during the Depression for youths who were "riding the rails" to Colorado.
In 1934, Ramon and Carolína purchased 1020 9th Street in Denver, Colorado. The house, built by Dr. William Smedley in 1872, was the oldest clapboard house in the city. Their home was known for its generous hospitality. In the 1940s, the City of Denver redlined their neighborhood, a practice that delineated cultural minority neighborhoods and prevented investment and development.
In 1946, the family home evolved into the Casa Mayan restaurant, one of the first Latinx-owned Mexican American restaurants in Denver. The restaurant was renowned for embodying Mexican hospitality and generosity, and it became a "mutualista," or refuge, for immigrants in Colorado.
Casa Mayan became a cultural center for artists, musicians, athletes, politicians, and architects, and advocacy groups like the West Side Coalition. Diners included Tex Ritter, William Shirer, Andrés Segovia, Marian Anderson, Paul Robeson, and many prominent locals. The restaurant became a kind of salon. President Harry Truman ate at the Casa Mayan in 1948.
Casa Mayan was closed in 1974 by the Denver Urban Renewal Authority when the neighborhood was razed to build the Auraria Campus. The home was spared demolition and declared a landmark later that year. González moved from Auraria to Athmar Park, where she found no comparable sense of community.
Marriage and children
Carolína married Ramon González in the 1910s or 1920s. Ramon died in 1960.
At the time of her death, she was survived by her children: sons Ralph, Ramon, and Arnold; daughters Maria Zimmermann, Belen Aranda, Marta Alcaro, and Celia Chia; 29 grandchildren; 40 great-grandchildren; and several great-great-grandchildren.
Death and legacy
Her former residence is now part of the National Register of Historic Places in Denver. It is one of the oldest remaining homes in the area, as most of the neighborhood was demolished for the Auraria Campus.
In 2006, Gregorio Alcaro and Trini H. González cofounded the Auraria Casa Mayan Heritage organization to protect the memory and community awareness of the cultural heritage of the area.
In 2020, González was inducted into the Colorado Women's Hall of Fame for her advocacy.
See also
Auraria Campus
Auraria, Denver
Chicano movement
National Register of Historic Places listings in Colorado
References
External links
Interview with Carolina Gonzalez from the Colorado Women's Hall of Fame
1894 births
1996 deaths
People from El Paso, Texas
People from Denver
American people of Mexican descent
|
Vincent Ward may refer to:
Vincent Ward (director) (born 1956), New Zealand film director, screenwriter and artist
Vincent Ward (politician) (1886–1946), New Zealand politician
Vincent M. Ward (born 1971), American actor
|
```html
<html lang="en">
<head>
<title>Optimize Options - Using the GNU Compiler Collection (GCC)</title>
<meta http-equiv="Content-Type" content="text/html">
<meta name="description" content="Using the GNU Compiler Collection (GCC)">
<meta name="generator" content="makeinfo 4.11">
<link title="Top" rel="start" href="index.html#Top">
<link rel="up" href="Invoking-GCC.html#Invoking-GCC" title="Invoking GCC">
<link rel="prev" href="Debugging-Options.html#Debugging-Options" title="Debugging Options">
<link rel="next" href="Preprocessor-Options.html#Preprocessor-Options" title="Preprocessor Options">
<link href="path_to_url" rel="generator-home" title="Texinfo Homepage">
<!--
Permission is granted to copy, distribute and/or modify this document
under the terms of the GNU Free Documentation License, Version 1.2 or
any later version published by the Free Software Foundation; with the
Invariant Sections being ``Funding Free Software'', the Front-Cover
Texts being (a) (see below), and with the Back-Cover Texts being (b)
(see below). A copy of the license is included in the section entitled
``GNU Free Documentation License''.
(a) The FSF's Front-Cover Text is:
A GNU Manual
(b) The FSF's Back-Cover Text is:
You have freedom to copy and modify this GNU Manual, like GNU
software. Copies published by the Free Software Foundation raise
funds for GNU development.-->
<meta http-equiv="Content-Style-Type" content="text/css">
<style type="text/css"><!--
pre.display { font-family:inherit }
pre.format { font-family:inherit }
pre.smalldisplay { font-family:inherit; font-size:smaller }
pre.smallformat { font-family:inherit; font-size:smaller }
pre.smallexample { font-size:smaller }
pre.smalllisp { font-size:smaller }
span.sc { font-variant:small-caps }
span.roman { font-family:serif; font-weight:normal; }
span.sansserif { font-family:sans-serif; font-weight:normal; }
--></style>
</head>
<body>
<div class="node">
<p>
<a name="Optimize-Options"></a>
Next: <a rel="next" accesskey="n" href="Preprocessor-Options.html#Preprocessor-Options">Preprocessor Options</a>,
Previous: <a rel="previous" accesskey="p" href="Debugging-Options.html#Debugging-Options">Debugging Options</a>,
Up: <a rel="up" accesskey="u" href="Invoking-GCC.html#Invoking-GCC">Invoking GCC</a>
<hr>
</div>
<h3 class="section">3.10 Options That Control Optimization</h3>
<p><a name="index-optimize-options-881"></a><a name="index-options_002c-optimization-882"></a>
These options control various sorts of optimizations.
<p>Without any optimization option, the compiler's goal is to reduce the
cost of compilation and to make debugging produce the expected
results. Statements are independent: if you stop the program with a
breakpoint between statements, you can then assign a new value to any
variable or change the program counter to any other statement in the
function and get exactly the results you expect from the source
code.
<p>Turning on optimization flags makes the compiler attempt to improve
the performance and/or code size at the expense of compilation time
and possibly the ability to debug the program.
<p>The compiler performs optimization based on the knowledge it has of the
program. Compiling multiple files at once to a single output file allows
the compiler to use information gained from all of the files when compiling
each of them.
<p>Not all optimizations are controlled directly by a flag. Only
optimizations that have a flag are listed in this section.
<p>Most optimizations are only enabled if an <samp><span class="option">-O</span></samp> level is set on
the command line. Otherwise they are disabled, even if individual
optimization flags are specified.
<p>Depending on the target and how GCC was configured, a slightly different
set of optimizations may be enabled at each <samp><span class="option">-O</span></samp> level than
those listed here. You can invoke GCC with <samp><span class="option">-Q --help=optimizers</span></samp>
to find out the exact set of optimizations that are enabled at each level.
See <a href="Overall-Options.html#Overall-Options">Overall Options</a>, for examples.
<dl>
<dt><code>-O</code><dt><code>-O1</code><dd><a name="index-O-883"></a><a name="index-O1-884"></a>Optimize. Optimizing compilation takes somewhat more time, and a lot
more memory for a large function.
<p>With <samp><span class="option">-O</span></samp>, the compiler tries to reduce code size and execution
time, without performing any optimizations that take a great deal of
compilation time.
<p><samp><span class="option">-O</span></samp> turns on the following optimization flags:
<pre class="smallexample"> -fauto-inc-dec
-fbranch-count-reg
-fcombine-stack-adjustments
-fcompare-elim
-fcprop-registers
-fdce
-fdefer-pop
-fdelayed-branch
-fdse
-fforward-propagate
-fguess-branch-probability
-fif-conversion2
-fif-conversion
-finline-functions-called-once
-fipa-pure-const
-fipa-profile
-fipa-reference
-fmerge-constants
-fmove-loop-invariants
-fshrink-wrap
-fsplit-wide-types
-ftree-bit-ccp
-ftree-ccp
-fssa-phiopt
-ftree-ch
-ftree-copy-prop
-ftree-copyrename
-ftree-dce
-ftree-dominator-opts
-ftree-dse
-ftree-forwprop
-ftree-fre
-ftree-phiprop
-ftree-sink
-ftree-slsr
-ftree-sra
-ftree-pta
-ftree-ter
-funit-at-a-time
</pre>
<p><samp><span class="option">-O</span></samp> also turns on <samp><span class="option">-fomit-frame-pointer</span></samp> on machines
where doing so does not interfere with debugging.
<br><dt><code>-O2</code><dd><a name="index-O2-885"></a>Optimize even more. GCC performs nearly all supported optimizations
that do not involve a space-speed tradeoff.
As compared to <samp><span class="option">-O</span></samp>, this option increases both compilation time
and the performance of the generated code.
<p><samp><span class="option">-O2</span></samp> turns on all optimization flags specified by <samp><span class="option">-O</span></samp>. It
also turns on the following optimization flags:
<pre class="smallexample"> -fthread-jumps
-falign-functions -falign-jumps
-falign-loops -falign-labels
-fcaller-saves
-fcrossjumping
-fcse-follow-jumps -fcse-skip-blocks
-fdelete-null-pointer-checks
-fdevirtualize -fdevirtualize-speculatively
-fexpensive-optimizations
-fgcse -fgcse-lm
-fhoist-adjacent-loads
-finline-small-functions
-findirect-inlining
-fipa-cp
-fipa-cp-alignment
-fipa-sra
-fipa-icf
-fisolate-erroneous-paths-dereference
-flra-remat
-foptimize-sibling-calls
-foptimize-strlen
-fpartial-inlining
-fpeephole2
-freorder-blocks -freorder-blocks-and-partition -freorder-functions
-frerun-cse-after-loop
-fsched-interblock -fsched-spec
-fschedule-insns -fschedule-insns2
-fstrict-aliasing -fstrict-overflow
-ftree-builtin-call-dce
-ftree-switch-conversion -ftree-tail-merge
-ftree-pre
-ftree-vrp
-fipa-ra
</pre>
<p>Please note the warning under <samp><span class="option">-fgcse</span></samp> about
invoking <samp><span class="option">-O2</span></samp> on programs that use computed gotos.
<br><dt><code>-O3</code><dd><a name="index-O3-886"></a>Optimize yet more. <samp><span class="option">-O3</span></samp> turns on all optimizations specified
by <samp><span class="option">-O2</span></samp> and also turns on the <samp><span class="option">-finline-functions</span></samp>,
<samp><span class="option">-funswitch-loops</span></samp>, <samp><span class="option">-fpredictive-commoning</span></samp>,
<samp><span class="option">-fgcse-after-reload</span></samp>, <samp><span class="option">-ftree-loop-vectorize</span></samp>,
<samp><span class="option">-ftree-loop-distribute-patterns</span></samp>,
<samp><span class="option">-ftree-slp-vectorize</span></samp>, <samp><span class="option">-fvect-cost-model</span></samp>,
<samp><span class="option">-ftree-partial-pre</span></samp> and <samp><span class="option">-fipa-cp-clone</span></samp> options.
<br><dt><code>-O0</code><dd><a name="index-O0-887"></a>Reduce compilation time and make debugging produce the expected
results. This is the default.
<br><dt><code>-Os</code><dd><a name="index-Os-888"></a>Optimize for size. <samp><span class="option">-Os</span></samp> enables all <samp><span class="option">-O2</span></samp> optimizations that
do not typically increase code size. It also performs further
optimizations designed to reduce code size.
<p><samp><span class="option">-Os</span></samp> disables the following optimization flags:
<pre class="smallexample"> -falign-functions -falign-jumps -falign-loops
-falign-labels -freorder-blocks -freorder-blocks-and-partition
-fprefetch-loop-arrays
</pre>
<br><dt><code>-Ofast</code><dd><a name="index-Ofast-889"></a>Disregard strict standards compliance. <samp><span class="option">-Ofast</span></samp> enables all
<samp><span class="option">-O3</span></samp> optimizations. It also enables optimizations that are not
valid for all standard-compliant programs.
It turns on <samp><span class="option">-ffast-math</span></samp> and the Fortran-specific
<samp><span class="option">-fno-protect-parens</span></samp> and <samp><span class="option">-fstack-arrays</span></samp>.
<br><dt><code>-Og</code><dd><a name="index-Og-890"></a>Optimize debugging experience. <samp><span class="option">-Og</span></samp> enables optimizations
that do not interfere with debugging. It should be the optimization
level of choice for the standard edit-compile-debug cycle, offering
a reasonable level of optimization while maintaining fast compilation
and a good debugging experience.
<p>If you use multiple <samp><span class="option">-O</span></samp> options, with or without level numbers,
the last such option is the one that is effective.
</dl>
<p>Options of the form <samp><span class="option">-f</span><var>flag</var></samp> specify machine-independent
flags. Most flags have both positive and negative forms; the negative
form of <samp><span class="option">-ffoo</span></samp> is <samp><span class="option">-fno-foo</span></samp>. In the table
below, only one of the forms is listed—the one you typically
use. You can figure out the other form by either removing ‘<samp><span class="samp">no-</span></samp>’
or adding it.
<p>The following options control specific optimizations. They are either
activated by <samp><span class="option">-O</span></samp> options or are related to ones that are. You
can use the following flags in the rare cases when “fine-tuning” of
optimizations to be performed is desired.
<dl>
<dt><code>-fno-defer-pop</code><dd><a name="index-fno_002ddefer_002dpop-891"></a>Always pop the arguments to each function call as soon as that function
returns. For machines that must pop arguments after a function call,
the compiler normally lets arguments accumulate on the stack for several
function calls and pops them all at once.
<p>Disabled at levels <samp><span class="option">-O</span></samp>, <samp><span class="option">-O2</span></samp>, <samp><span class="option">-O3</span></samp>, <samp><span class="option">-Os</span></samp>.
<br><dt><code>-fforward-propagate</code><dd><a name="index-fforward_002dpropagate-892"></a>Perform a forward propagation pass on RTL. The pass tries to combine two
instructions and checks if the result can be simplified. If loop unrolling
is active, two passes are performed and the second is scheduled after
loop unrolling.
<p>This option is enabled by default at optimization levels <samp><span class="option">-O</span></samp>,
<samp><span class="option">-O2</span></samp>, <samp><span class="option">-O3</span></samp>, <samp><span class="option">-Os</span></samp>.
<br><dt><code>-ffp-contract=</code><var>style</var><dd><a name="index-ffp_002dcontract-893"></a><samp><span class="option">-ffp-contract=off</span></samp> disables floating-point expression contraction.
<samp><span class="option">-ffp-contract=fast</span></samp> enables floating-point expression contraction
such as forming of fused multiply-add operations if the target has
native support for them.
<samp><span class="option">-ffp-contract=on</span></samp> enables floating-point expression contraction
if allowed by the language standard. This is currently not implemented
and is treated as equivalent to <samp><span class="option">-ffp-contract=off</span></samp>.
<p>The default is <samp><span class="option">-ffp-contract=fast</span></samp>.
<br><dt><code>-fomit-frame-pointer</code><dd><a name="index-fomit_002dframe_002dpointer-894"></a>Don't keep the frame pointer in a register for functions that
don't need one. This avoids the instructions to save, set up and
restore frame pointers; it also makes an extra register available
in many functions. <strong>It also makes debugging impossible on
some machines.</strong>
<p>On some machines, such as the VAX, this flag has no effect, because
the standard calling sequence automatically handles the frame pointer
and nothing is saved by pretending it doesn't exist. The
machine-description macro <code>FRAME_POINTER_REQUIRED</code> controls
whether a target machine supports this flag. See <a href="../gccint/Registers.html#Registers">Register Usage</a>.
<p>The default setting (when not optimizing for
size) for 32-bit GNU/Linux x86 and 32-bit Darwin x86 targets is
<samp><span class="option">-fomit-frame-pointer</span></samp>. You can configure GCC with the
<samp><span class="option">--enable-frame-pointer</span></samp> configure option to change the default.
<p>Enabled at levels <samp><span class="option">-O</span></samp>, <samp><span class="option">-O2</span></samp>, <samp><span class="option">-O3</span></samp>, <samp><span class="option">-Os</span></samp>.
<br><dt><code>-foptimize-sibling-calls</code><dd><a name="index-foptimize_002dsibling_002dcalls-895"></a>Optimize sibling and tail recursive calls.
<p>Enabled at levels <samp><span class="option">-O2</span></samp>, <samp><span class="option">-O3</span></samp>, <samp><span class="option">-Os</span></samp>.
<br><dt><code>-foptimize-strlen</code><dd><a name="index-foptimize_002dstrlen-896"></a>Optimize various standard C string functions (e.g. <code>strlen</code>,
<code>strchr</code> or <code>strcpy</code>) and
their <code>_FORTIFY_SOURCE</code> counterparts into faster alternatives.
<p>Enabled at levels <samp><span class="option">-O2</span></samp>, <samp><span class="option">-O3</span></samp>.
<br><dt><code>-fno-inline</code><dd><a name="index-fno_002dinline-897"></a>Do not expand any functions inline apart from those marked with
the <code>always_inline</code> attribute. This is the default when not
optimizing.
<p>Single functions can be exempted from inlining by marking them
with the <code>noinline</code> attribute.
<br><dt><code>-finline-small-functions</code><dd><a name="index-finline_002dsmall_002dfunctions-898"></a>Integrate functions into their callers when their body is smaller than expected
function call code (so overall size of program gets smaller). The compiler
heuristically decides which functions are simple enough to be worth integrating
in this way. This inlining applies to all functions, even those not declared
inline.
<p>Enabled at level <samp><span class="option">-O2</span></samp>.
<br><dt><code>-findirect-inlining</code><dd><a name="index-findirect_002dinlining-899"></a>Inline also indirect calls that are discovered to be known at compile
time thanks to previous inlining. This option has any effect only
when inlining itself is turned on by the <samp><span class="option">-finline-functions</span></samp>
or <samp><span class="option">-finline-small-functions</span></samp> options.
<p>Enabled at level <samp><span class="option">-O2</span></samp>.
<br><dt><code>-finline-functions</code><dd><a name="index-finline_002dfunctions-900"></a>Consider all functions for inlining, even if they are not declared inline.
The compiler heuristically decides which functions are worth integrating
in this way.
<p>If all calls to a given function are integrated, and the function is
declared <code>static</code>, then the function is normally not output as
assembler code in its own right.
<p>Enabled at level <samp><span class="option">-O3</span></samp>.
<br><dt><code>-finline-functions-called-once</code><dd><a name="index-finline_002dfunctions_002dcalled_002donce-901"></a>Consider all <code>static</code> functions called once for inlining into their
caller even if they are not marked <code>inline</code>. If a call to a given
function is integrated, then the function is not output as assembler code
in its own right.
<p>Enabled at levels <samp><span class="option">-O1</span></samp>, <samp><span class="option">-O2</span></samp>, <samp><span class="option">-O3</span></samp> and <samp><span class="option">-Os</span></samp>.
<br><dt><code>-fearly-inlining</code><dd><a name="index-fearly_002dinlining-902"></a>Inline functions marked by <code>always_inline</code> and functions whose body seems
smaller than the function call overhead early before doing
<samp><span class="option">-fprofile-generate</span></samp> instrumentation and real inlining pass. Doing so
makes profiling significantly cheaper and usually inlining faster on programs
having large chains of nested wrapper functions.
<p>Enabled by default.
<br><dt><code>-fipa-sra</code><dd><a name="index-fipa_002dsra-903"></a>Perform interprocedural scalar replacement of aggregates, removal of
unused parameters and replacement of parameters passed by reference
by parameters passed by value.
<p>Enabled at levels <samp><span class="option">-O2</span></samp>, <samp><span class="option">-O3</span></samp> and <samp><span class="option">-Os</span></samp>.
<br><dt><code>-finline-limit=</code><var>n</var><dd><a name="index-finline_002dlimit-904"></a>By default, GCC limits the size of functions that can be inlined. This flag
allows coarse control of this limit. <var>n</var> is the size of functions that
can be inlined in number of pseudo instructions.
<p>Inlining is actually controlled by a number of parameters, which may be
specified individually by using <samp><span class="option">--param </span><var>name</var><span class="option">=</span><var>value</var></samp>.
The <samp><span class="option">-finline-limit=</span><var>n</var></samp> option sets some of these parameters
as follows:
<dl>
<dt><code>max-inline-insns-single</code><dd>is set to <var>n</var>/2.
<br><dt><code>max-inline-insns-auto</code><dd>is set to <var>n</var>/2.
</dl>
<p>See below for a documentation of the individual
parameters controlling inlining and for the defaults of these parameters.
<p><em>Note:</em> there may be no value to <samp><span class="option">-finline-limit</span></samp> that results
in default behavior.
<p><em>Note:</em> pseudo instruction represents, in this particular context, an
abstract measurement of function's size. In no way does it represent a count
of assembly instructions and as such its exact meaning might change from one
release to another.
<br><dt><code>-fno-keep-inline-dllexport</code><dd><a name="index-fno_002dkeep_002dinline_002ddllexport-905"></a>This is a more fine-grained version of <samp><span class="option">-fkeep-inline-functions</span></samp>,
which applies only to functions that are declared using the <code>dllexport</code>
attribute or declspec (See <a href="Function-Attributes.html#Function-Attributes">Declaring Attributes of Functions</a>.)
<br><dt><code>-fkeep-inline-functions</code><dd><a name="index-fkeep_002dinline_002dfunctions-906"></a>In C, emit <code>static</code> functions that are declared <code>inline</code>
into the object file, even if the function has been inlined into all
of its callers. This switch does not affect functions using the
<code>extern inline</code> extension in GNU C90. In C++, emit any and all
inline functions into the object file.
<br><dt><code>-fkeep-static-consts</code><dd><a name="index-fkeep_002dstatic_002dconsts-907"></a>Emit variables declared <code>static const</code> when optimization isn't turned
on, even if the variables aren't referenced.
<p>GCC enables this option by default. If you want to force the compiler to
check if a variable is referenced, regardless of whether or not
optimization is turned on, use the <samp><span class="option">-fno-keep-static-consts</span></samp> option.
<br><dt><code>-fmerge-constants</code><dd><a name="index-fmerge_002dconstants-908"></a>Attempt to merge identical constants (string constants and floating-point
constants) across compilation units.
<p>This option is the default for optimized compilation if the assembler and
linker support it. Use <samp><span class="option">-fno-merge-constants</span></samp> to inhibit this
behavior.
<p>Enabled at levels <samp><span class="option">-O</span></samp>, <samp><span class="option">-O2</span></samp>, <samp><span class="option">-O3</span></samp>, <samp><span class="option">-Os</span></samp>.
<br><dt><code>-fmerge-all-constants</code><dd><a name="index-fmerge_002dall_002dconstants-909"></a>Attempt to merge identical constants and identical variables.
<p>This option implies <samp><span class="option">-fmerge-constants</span></samp>. In addition to
<samp><span class="option">-fmerge-constants</span></samp> this considers e.g. even constant initialized
arrays or initialized constant variables with integral or floating-point
types. Languages like C or C++ require each variable, including multiple
instances of the same variable in recursive calls, to have distinct locations,
so using this option results in non-conforming
behavior.
<br><dt><code>-fmodulo-sched</code><dd><a name="index-fmodulo_002dsched-910"></a>Perform swing modulo scheduling immediately before the first scheduling
pass. This pass looks at innermost loops and reorders their
instructions by overlapping different iterations.
<br><dt><code>-fmodulo-sched-allow-regmoves</code><dd><a name="index-fmodulo_002dsched_002dallow_002dregmoves-911"></a>Perform more aggressive SMS-based modulo scheduling with register moves
allowed. By setting this flag certain anti-dependences edges are
deleted, which triggers the generation of reg-moves based on the
life-range analysis. This option is effective only with
<samp><span class="option">-fmodulo-sched</span></samp> enabled.
<br><dt><code>-fno-branch-count-reg</code><dd><a name="index-fno_002dbranch_002dcount_002dreg-912"></a>Do not use “decrement and branch” instructions on a count register,
but instead generate a sequence of instructions that decrement a
register, compare it against zero, then branch based upon the result.
This option is only meaningful on architectures that support such
instructions, which include x86, PowerPC, IA-64 and S/390.
<p>Enabled by default at <samp><span class="option">-O1</span></samp> and higher.
<p>The default is <samp><span class="option">-fbranch-count-reg</span></samp>.
<br><dt><code>-fno-function-cse</code><dd><a name="index-fno_002dfunction_002dcse-913"></a>Do not put function addresses in registers; make each instruction that
calls a constant function contain the function's address explicitly.
<p>This option results in less efficient code, but some strange hacks
that alter the assembler output may be confused by the optimizations
performed when this option is not used.
<p>The default is <samp><span class="option">-ffunction-cse</span></samp>.
<br><dt><code>-fno-zero-initialized-in-bss</code><dd><a name="index-fno_002dzero_002dinitialized_002din_002dbss-914"></a>If the target supports a BSS section, GCC by default puts variables that
are initialized to zero into BSS. This can save space in the resulting
code.
<p>This option turns off this behavior because some programs explicitly
rely on variables going to the data section—e.g., so that the
resulting executable can find the beginning of that section and/or make
assumptions based on that.
<p>The default is <samp><span class="option">-fzero-initialized-in-bss</span></samp>.
<br><dt><code>-fthread-jumps</code><dd><a name="index-fthread_002djumps-915"></a>Perform optimizations that check to see if a jump branches to a
location where another comparison subsumed by the first is found. If
so, the first branch is redirected to either the destination of the
second branch or a point immediately following it, depending on whether
the condition is known to be true or false.
<p>Enabled at levels <samp><span class="option">-O2</span></samp>, <samp><span class="option">-O3</span></samp>, <samp><span class="option">-Os</span></samp>.
<br><dt><code>-fsplit-wide-types</code><dd><a name="index-fsplit_002dwide_002dtypes-916"></a>When using a type that occupies multiple registers, such as <code>long
long</code> on a 32-bit system, split the registers apart and allocate them
independently. This normally generates better code for those types,
but may make debugging more difficult.
<p>Enabled at levels <samp><span class="option">-O</span></samp>, <samp><span class="option">-O2</span></samp>, <samp><span class="option">-O3</span></samp>,
<samp><span class="option">-Os</span></samp>.
<br><dt><code>-fcse-follow-jumps</code><dd><a name="index-fcse_002dfollow_002djumps-917"></a>In common subexpression elimination (CSE), scan through jump instructions
when the target of the jump is not reached by any other path. For
example, when CSE encounters an <code>if</code> statement with an
<code>else</code> clause, CSE follows the jump when the condition
tested is false.
<p>Enabled at levels <samp><span class="option">-O2</span></samp>, <samp><span class="option">-O3</span></samp>, <samp><span class="option">-Os</span></samp>.
<br><dt><code>-fcse-skip-blocks</code><dd><a name="index-fcse_002dskip_002dblocks-918"></a>This is similar to <samp><span class="option">-fcse-follow-jumps</span></samp>, but causes CSE to
follow jumps that conditionally skip over blocks. When CSE
encounters a simple <code>if</code> statement with no else clause,
<samp><span class="option">-fcse-skip-blocks</span></samp> causes CSE to follow the jump around the
body of the <code>if</code>.
<p>Enabled at levels <samp><span class="option">-O2</span></samp>, <samp><span class="option">-O3</span></samp>, <samp><span class="option">-Os</span></samp>.
<br><dt><code>-frerun-cse-after-loop</code><dd><a name="index-frerun_002dcse_002dafter_002dloop-919"></a>Re-run common subexpression elimination after loop optimizations are
performed.
<p>Enabled at levels <samp><span class="option">-O2</span></samp>, <samp><span class="option">-O3</span></samp>, <samp><span class="option">-Os</span></samp>.
<br><dt><code>-fgcse</code><dd><a name="index-fgcse-920"></a>Perform a global common subexpression elimination pass.
This pass also performs global constant and copy propagation.
<p><em>Note:</em> When compiling a program using computed gotos, a GCC
extension, you may get better run-time performance if you disable
the global common subexpression elimination pass by adding
<samp><span class="option">-fno-gcse</span></samp> to the command line.
<p>Enabled at levels <samp><span class="option">-O2</span></samp>, <samp><span class="option">-O3</span></samp>, <samp><span class="option">-Os</span></samp>.
<br><dt><code>-fgcse-lm</code><dd><a name="index-fgcse_002dlm-921"></a>When <samp><span class="option">-fgcse-lm</span></samp> is enabled, global common subexpression elimination
attempts to move loads that are only killed by stores into themselves. This
allows a loop containing a load/store sequence to be changed to a load outside
the loop, and a copy/store within the loop.
<p>Enabled by default when <samp><span class="option">-fgcse</span></samp> is enabled.
<br><dt><code>-fgcse-sm</code><dd><a name="index-fgcse_002dsm-922"></a>When <samp><span class="option">-fgcse-sm</span></samp> is enabled, a store motion pass is run after
global common subexpression elimination. This pass attempts to move
stores out of loops. When used in conjunction with <samp><span class="option">-fgcse-lm</span></samp>,
loops containing a load/store sequence can be changed to a load before
the loop and a store after the loop.
<p>Not enabled at any optimization level.
<br><dt><code>-fgcse-las</code><dd><a name="index-fgcse_002dlas-923"></a>When <samp><span class="option">-fgcse-las</span></samp> is enabled, the global common subexpression
elimination pass eliminates redundant loads that come after stores to the
same memory location (both partial and full redundancies).
<p>Not enabled at any optimization level.
<br><dt><code>-fgcse-after-reload</code><dd><a name="index-fgcse_002dafter_002dreload-924"></a>When <samp><span class="option">-fgcse-after-reload</span></samp> is enabled, a redundant load elimination
pass is performed after reload. The purpose of this pass is to clean up
redundant spilling.
<br><dt><code>-faggressive-loop-optimizations</code><dd><a name="index-faggressive_002dloop_002doptimizations-925"></a>This option tells the loop optimizer to use language constraints to
derive bounds for the number of iterations of a loop. This assumes that
loop code does not invoke undefined behavior by for example causing signed
integer overflows or out-of-bound array accesses. The bounds for the
number of iterations of a loop are used to guide loop unrolling and peeling
and loop exit test optimizations.
This option is enabled by default.
<br><dt><code>-funsafe-loop-optimizations</code><dd><a name="index-funsafe_002dloop_002doptimizations-926"></a>This option tells the loop optimizer to assume that loop indices do not
overflow, and that loops with a nontrivial exit condition are not
infinite. This enables a wider range of loop optimizations even if
the loop optimizer itself cannot prove that these assumptions are valid.
If you use <samp><span class="option">-Wunsafe-loop-optimizations</span></samp>, the compiler warns you
if it finds this kind of loop.
<br><dt><code>-fcrossjumping</code><dd><a name="index-fcrossjumping-927"></a>Perform cross-jumping transformation.
This transformation unifies equivalent code and saves code size. The
resulting code may or may not perform better than without cross-jumping.
<p>Enabled at levels <samp><span class="option">-O2</span></samp>, <samp><span class="option">-O3</span></samp>, <samp><span class="option">-Os</span></samp>.
<br><dt><code>-fauto-inc-dec</code><dd><a name="index-fauto_002dinc_002ddec-928"></a>Combine increments or decrements of addresses with memory accesses.
This pass is always skipped on architectures that do not have
instructions to support this. Enabled by default at <samp><span class="option">-O</span></samp> and
higher on architectures that support this.
<br><dt><code>-fdce</code><dd><a name="index-fdce-929"></a>Perform dead code elimination (DCE) on RTL.
Enabled by default at <samp><span class="option">-O</span></samp> and higher.
<br><dt><code>-fdse</code><dd><a name="index-fdse-930"></a>Perform dead store elimination (DSE) on RTL.
Enabled by default at <samp><span class="option">-O</span></samp> and higher.
<br><dt><code>-fif-conversion</code><dd><a name="index-fif_002dconversion-931"></a>Attempt to transform conditional jumps into branch-less equivalents. This
includes use of conditional moves, min, max, set flags and abs instructions, and
some tricks doable with standard arithmetic. The use of conditional execution
on chips where it is available is controlled by <samp><span class="option">-fif-conversion2</span></samp>.
<p>Enabled at levels <samp><span class="option">-O</span></samp>, <samp><span class="option">-O2</span></samp>, <samp><span class="option">-O3</span></samp>, <samp><span class="option">-Os</span></samp>.
<br><dt><code>-fif-conversion2</code><dd><a name="index-fif_002dconversion2-932"></a>Use conditional execution (where available) to transform conditional jumps into
branch-less equivalents.
<p>Enabled at levels <samp><span class="option">-O</span></samp>, <samp><span class="option">-O2</span></samp>, <samp><span class="option">-O3</span></samp>, <samp><span class="option">-Os</span></samp>.
<br><dt><code>-fdeclone-ctor-dtor</code><dd><a name="index-fdeclone_002dctor_002ddtor-933"></a>The C++ ABI requires multiple entry points for constructors and
destructors: one for a base subobject, one for a complete object, and
one for a virtual destructor that calls <code>operator delete</code> afterwards.
For a hierarchy with virtual bases, the base and complete variants are
clones, which means two copies of the function. With this option, the
base and complete variants are changed to be thunks that call a common
implementation.
<p>Enabled by <samp><span class="option">-Os</span></samp>.
<br><dt><code>-fdelete-null-pointer-checks</code><dd><a name="index-fdelete_002dnull_002dpointer_002dchecks-934"></a>Assume that programs cannot safely dereference null pointers, and that
no code or data element resides there. This enables simple constant
folding optimizations at all optimization levels. In addition, other
optimization passes in GCC use this flag to control global dataflow
analyses that eliminate useless checks for null pointers; these assume
that if a pointer is checked after it has already been dereferenced,
it cannot be null.
<p>Note however that in some environments this assumption is not true.
Use <samp><span class="option">-fno-delete-null-pointer-checks</span></samp> to disable this optimization
for programs that depend on that behavior.
<p>Some targets, especially embedded ones, disable this option at all levels.
Otherwise it is enabled at all levels: <samp><span class="option">-O0</span></samp>, <samp><span class="option">-O1</span></samp>,
<samp><span class="option">-O2</span></samp>, <samp><span class="option">-O3</span></samp>, <samp><span class="option">-Os</span></samp>. Passes that use the information
are enabled independently at different optimization levels.
<br><dt><code>-fdevirtualize</code><dd><a name="index-fdevirtualize-935"></a>Attempt to convert calls to virtual functions to direct calls. This
is done both within a procedure and interprocedurally as part of
indirect inlining (<samp><span class="option">-findirect-inlining</span></samp>) and interprocedural constant
propagation (<samp><span class="option">-fipa-cp</span></samp>).
Enabled at levels <samp><span class="option">-O2</span></samp>, <samp><span class="option">-O3</span></samp>, <samp><span class="option">-Os</span></samp>.
<br><dt><code>-fdevirtualize-speculatively</code><dd><a name="index-fdevirtualize_002dspeculatively-936"></a>Attempt to convert calls to virtual functions to speculative direct calls.
Based on the analysis of the type inheritance graph, determine for a given call
the set of likely targets. If the set is small, preferably of size 1, change
the call into a conditional deciding between direct and indirect calls. The
speculative calls enable more optimizations, such as inlining. When they seem
useless after further optimization, they are converted back into original form.
<br><dt><code>-fdevirtualize-at-ltrans</code><dd><a name="index-fdevirtualize_002dat_002dltrans-937"></a>Stream extra information needed for aggressive devirtualization when running
the link-time optimizer in local transformation mode.
This option enables more devirtualization but
significantly increases the size of streamed data. For this reason it is
disabled by default.
<br><dt><code>-fexpensive-optimizations</code><dd><a name="index-fexpensive_002doptimizations-938"></a>Perform a number of minor optimizations that are relatively expensive.
<p>Enabled at levels <samp><span class="option">-O2</span></samp>, <samp><span class="option">-O3</span></samp>, <samp><span class="option">-Os</span></samp>.
<br><dt><code>-free</code><dd><a name="index-free-939"></a>Attempt to remove redundant extension instructions. This is especially
helpful for the x86-64 architecture, which implicitly zero-extends in 64-bit
registers after writing to their lower 32-bit half.
<p>Enabled for Alpha, AArch64 and x86 at levels <samp><span class="option">-O2</span></samp>,
<samp><span class="option">-O3</span></samp>, <samp><span class="option">-Os</span></samp>.
<br><dt><code>-fno-lifetime-dse</code><dd><a name="index-fno_002dlifetime_002ddse-940"></a>In C++ the value of an object is only affected by changes within its
lifetime: when the constructor begins, the object has an indeterminate
value, and any changes during the lifetime of the object are dead when
the object is destroyed. Normally dead store elimination will take
advantage of this; if your code relies on the value of the object
storage persisting beyond the lifetime of the object, you can use this
flag to disable this optimization.
<br><dt><code>-flive-range-shrinkage</code><dd><a name="index-flive_002drange_002dshrinkage-941"></a>Attempt to decrease register pressure through register live range
shrinkage. This is helpful for fast processors with small or moderate
size register sets.
<br><dt><code>-fira-algorithm=</code><var>algorithm</var><dd><a name="index-fira_002dalgorithm-942"></a>Use the specified coloring algorithm for the integrated register
allocator. The <var>algorithm</var> argument can be ‘<samp><span class="samp">priority</span></samp>’, which
specifies Chow's priority coloring, or ‘<samp><span class="samp">CB</span></samp>’, which specifies
Chaitin-Briggs coloring. Chaitin-Briggs coloring is not implemented
for all architectures, but for those targets that do support it, it is
the default because it generates better code.
<br><dt><code>-fira-region=</code><var>region</var><dd><a name="index-fira_002dregion-943"></a>Use specified regions for the integrated register allocator. The
<var>region</var> argument should be one of the following:
<dl>
<dt>‘<samp><span class="samp">all</span></samp>’<dd>Use all loops as register allocation regions.
This can give the best results for machines with a small and/or
irregular register set.
<br><dt>‘<samp><span class="samp">mixed</span></samp>’<dd>Use all loops except for loops with small register pressure
as the regions. This value usually gives
the best results in most cases and for most architectures,
and is enabled by default when compiling with optimization for speed
(<samp><span class="option">-O</span></samp>, <samp><span class="option">-O2</span></samp>, <small class="dots">...</small>).
<br><dt>‘<samp><span class="samp">one</span></samp>’<dd>Use all functions as a single region.
This typically results in the smallest code size, and is enabled by default for
<samp><span class="option">-Os</span></samp> or <samp><span class="option">-O0</span></samp>.
</dl>
<br><dt><code>-fira-hoist-pressure</code><dd><a name="index-fira_002dhoist_002dpressure-944"></a>Use IRA to evaluate register pressure in the code hoisting pass for
decisions to hoist expressions. This option usually results in smaller
code, but it can slow the compiler down.
<p>This option is enabled at level <samp><span class="option">-Os</span></samp> for all targets.
<br><dt><code>-fira-loop-pressure</code><dd><a name="index-fira_002dloop_002dpressure-945"></a>Use IRA to evaluate register pressure in loops for decisions to move
loop invariants. This option usually results in generation
of faster and smaller code on machines with large register files (>= 32
registers), but it can slow the compiler down.
<p>This option is enabled at level <samp><span class="option">-O3</span></samp> for some targets.
<br><dt><code>-fno-ira-share-save-slots</code><dd><a name="index-fno_002dira_002dshare_002dsave_002dslots-946"></a>Disable sharing of stack slots used for saving call-used hard
registers living through a call. Each hard register gets a
separate stack slot, and as a result function stack frames are
larger.
<br><dt><code>-fno-ira-share-spill-slots</code><dd><a name="index-fno_002dira_002dshare_002dspill_002dslots-947"></a>Disable sharing of stack slots allocated for pseudo-registers. Each
pseudo-register that does not get a hard register gets a separate
stack slot, and as a result function stack frames are larger.
<br><dt><code>-fira-verbose=</code><var>n</var><dd><a name="index-fira_002dverbose-948"></a>Control the verbosity of the dump file for the integrated register allocator.
The default value is 5. If the value <var>n</var> is greater than or equal to 10,
the dump output is sent to stderr using the same format as <var>n</var> minus 10.
<br><dt><code>-flra-remat</code><dd><a name="index-flra_002dremat-949"></a>Enable CFG-sensitive rematerialization in LRA. Instead of loading
values of spilled pseudos, LRA tries to rematerialize (recalculate)
values if it is profitable.
<p>Enabled at levels <samp><span class="option">-O2</span></samp>, <samp><span class="option">-O3</span></samp>, <samp><span class="option">-Os</span></samp>.
<br><dt><code>-fdelayed-branch</code><dd><a name="index-fdelayed_002dbranch-950"></a>If supported for the target machine, attempt to reorder instructions
to exploit instruction slots available after delayed branch
instructions.
<p>Enabled at levels <samp><span class="option">-O</span></samp>, <samp><span class="option">-O2</span></samp>, <samp><span class="option">-O3</span></samp>, <samp><span class="option">-Os</span></samp>.
<br><dt><code>-fschedule-insns</code><dd><a name="index-fschedule_002dinsns-951"></a>If supported for the target machine, attempt to reorder instructions to
eliminate execution stalls due to required data being unavailable. This
helps machines that have slow floating point or memory load instructions
by allowing other instructions to be issued until the result of the load
or floating-point instruction is required.
<p>Enabled at levels <samp><span class="option">-O2</span></samp>, <samp><span class="option">-O3</span></samp>.
<br><dt><code>-fschedule-insns2</code><dd><a name="index-fschedule_002dinsns2-952"></a>Similar to <samp><span class="option">-fschedule-insns</span></samp>, but requests an additional pass of
instruction scheduling after register allocation has been done. This is
especially useful on machines with a relatively small number of
registers and where memory load instructions take more than one cycle.
<p>Enabled at levels <samp><span class="option">-O2</span></samp>, <samp><span class="option">-O3</span></samp>, <samp><span class="option">-Os</span></samp>.
<br><dt><code>-fno-sched-interblock</code><dd><a name="index-fno_002dsched_002dinterblock-953"></a>Don't schedule instructions across basic blocks. This is normally
enabled by default when scheduling before register allocation, i.e.
with <samp><span class="option">-fschedule-insns</span></samp> or at <samp><span class="option">-O2</span></samp> or higher.
<br><dt><code>-fno-sched-spec</code><dd><a name="index-fno_002dsched_002dspec-954"></a>Don't allow speculative motion of non-load instructions. This is normally
enabled by default when scheduling before register allocation, i.e.
with <samp><span class="option">-fschedule-insns</span></samp> or at <samp><span class="option">-O2</span></samp> or higher.
<br><dt><code>-fsched-pressure</code><dd><a name="index-fsched_002dpressure-955"></a>Enable register pressure sensitive insn scheduling before register
allocation. This only makes sense when scheduling before register
allocation is enabled, i.e. with <samp><span class="option">-fschedule-insns</span></samp> or at
<samp><span class="option">-O2</span></samp> or higher. Usage of this option can improve the
generated code and decrease its size by preventing register pressure
increase above the number of available hard registers and subsequent
spills in register allocation.
<br><dt><code>-fsched-spec-load</code><dd><a name="index-fsched_002dspec_002dload-956"></a>Allow speculative motion of some load instructions. This only makes
sense when scheduling before register allocation, i.e. with
<samp><span class="option">-fschedule-insns</span></samp> or at <samp><span class="option">-O2</span></samp> or higher.
<br><dt><code>-fsched-spec-load-dangerous</code><dd><a name="index-fsched_002dspec_002dload_002ddangerous-957"></a>Allow speculative motion of more load instructions. This only makes
sense when scheduling before register allocation, i.e. with
<samp><span class="option">-fschedule-insns</span></samp> or at <samp><span class="option">-O2</span></samp> or higher.
<br><dt><code>-fsched-stalled-insns</code><dt><code>-fsched-stalled-insns=</code><var>n</var><dd><a name="index-fsched_002dstalled_002dinsns-958"></a>Define how many insns (if any) can be moved prematurely from the queue
of stalled insns into the ready list during the second scheduling pass.
<samp><span class="option">-fno-sched-stalled-insns</span></samp> means that no insns are moved
prematurely, <samp><span class="option">-fsched-stalled-insns=0</span></samp> means there is no limit
on how many queued insns can be moved prematurely.
<samp><span class="option">-fsched-stalled-insns</span></samp> without a value is equivalent to
<samp><span class="option">-fsched-stalled-insns=1</span></samp>.
<br><dt><code>-fsched-stalled-insns-dep</code><dt><code>-fsched-stalled-insns-dep=</code><var>n</var><dd><a name="index-fsched_002dstalled_002dinsns_002ddep-959"></a>Define how many insn groups (cycles) are examined for a dependency
on a stalled insn that is a candidate for premature removal from the queue
of stalled insns. This has an effect only during the second scheduling pass,
and only if <samp><span class="option">-fsched-stalled-insns</span></samp> is used.
<samp><span class="option">-fno-sched-stalled-insns-dep</span></samp> is equivalent to
<samp><span class="option">-fsched-stalled-insns-dep=0</span></samp>.
<samp><span class="option">-fsched-stalled-insns-dep</span></samp> without a value is equivalent to
<samp><span class="option">-fsched-stalled-insns-dep=1</span></samp>.
<br><dt><code>-fsched2-use-superblocks</code><dd><a name="index-fsched2_002duse_002dsuperblocks-960"></a>When scheduling after register allocation, use superblock scheduling.
This allows motion across basic block boundaries,
resulting in faster schedules. This option is experimental, as not all machine
descriptions used by GCC model the CPU closely enough to avoid unreliable
results from the algorithm.
<p>This only makes sense when scheduling after register allocation, i.e. with
<samp><span class="option">-fschedule-insns2</span></samp> or at <samp><span class="option">-O2</span></samp> or higher.
<br><dt><code>-fsched-group-heuristic</code><dd><a name="index-fsched_002dgroup_002dheuristic-961"></a>Enable the group heuristic in the scheduler. This heuristic favors
the instruction that belongs to a schedule group. This is enabled
by default when scheduling is enabled, i.e. with <samp><span class="option">-fschedule-insns</span></samp>
or <samp><span class="option">-fschedule-insns2</span></samp> or at <samp><span class="option">-O2</span></samp> or higher.
<br><dt><code>-fsched-critical-path-heuristic</code><dd><a name="index-fsched_002dcritical_002dpath_002dheuristic-962"></a>Enable the critical-path heuristic in the scheduler. This heuristic favors
instructions on the critical path. This is enabled by default when
scheduling is enabled, i.e. with <samp><span class="option">-fschedule-insns</span></samp>
or <samp><span class="option">-fschedule-insns2</span></samp> or at <samp><span class="option">-O2</span></samp> or higher.
<br><dt><code>-fsched-spec-insn-heuristic</code><dd><a name="index-fsched_002dspec_002dinsn_002dheuristic-963"></a>Enable the speculative instruction heuristic in the scheduler. This
heuristic favors speculative instructions with greater dependency weakness.
This is enabled by default when scheduling is enabled, i.e.
with <samp><span class="option">-fschedule-insns</span></samp> or <samp><span class="option">-fschedule-insns2</span></samp>
or at <samp><span class="option">-O2</span></samp> or higher.
<br><dt><code>-fsched-rank-heuristic</code><dd><a name="index-fsched_002drank_002dheuristic-964"></a>Enable the rank heuristic in the scheduler. This heuristic favors
the instruction belonging to a basic block with greater size or frequency.
This is enabled by default when scheduling is enabled, i.e.
with <samp><span class="option">-fschedule-insns</span></samp> or <samp><span class="option">-fschedule-insns2</span></samp> or
at <samp><span class="option">-O2</span></samp> or higher.
<br><dt><code>-fsched-last-insn-heuristic</code><dd><a name="index-fsched_002dlast_002dinsn_002dheuristic-965"></a>Enable the last-instruction heuristic in the scheduler. This heuristic
favors the instruction that is less dependent on the last instruction
scheduled. This is enabled by default when scheduling is enabled,
i.e. with <samp><span class="option">-fschedule-insns</span></samp> or <samp><span class="option">-fschedule-insns2</span></samp> or
at <samp><span class="option">-O2</span></samp> or higher.
<br><dt><code>-fsched-dep-count-heuristic</code><dd><a name="index-fsched_002ddep_002dcount_002dheuristic-966"></a>Enable the dependent-count heuristic in the scheduler. This heuristic
favors the instruction that has more instructions depending on it.
This is enabled by default when scheduling is enabled, i.e.
with <samp><span class="option">-fschedule-insns</span></samp> or <samp><span class="option">-fschedule-insns2</span></samp> or
at <samp><span class="option">-O2</span></samp> or higher.
<br><dt><code>-freschedule-modulo-scheduled-loops</code><dd><a name="index-freschedule_002dmodulo_002dscheduled_002dloops-967"></a>Modulo scheduling is performed before traditional scheduling. If a loop
is modulo scheduled, later scheduling passes may change its schedule.
Use this option to control that behavior.
<br><dt><code>-fselective-scheduling</code><dd><a name="index-fselective_002dscheduling-968"></a>Schedule instructions using selective scheduling algorithm. Selective
scheduling runs instead of the first scheduler pass.
<br><dt><code>-fselective-scheduling2</code><dd><a name="index-fselective_002dscheduling2-969"></a>Schedule instructions using selective scheduling algorithm. Selective
scheduling runs instead of the second scheduler pass.
<br><dt><code>-fsel-sched-pipelining</code><dd><a name="index-fsel_002dsched_002dpipelining-970"></a>Enable software pipelining of innermost loops during selective scheduling.
This option has no effect unless one of <samp><span class="option">-fselective-scheduling</span></samp> or
<samp><span class="option">-fselective-scheduling2</span></samp> is turned on.
<br><dt><code>-fsel-sched-pipelining-outer-loops</code><dd><a name="index-fsel_002dsched_002dpipelining_002douter_002dloops-971"></a>When pipelining loops during selective scheduling, also pipeline outer loops.
This option has no effect unless <samp><span class="option">-fsel-sched-pipelining</span></samp> is turned on.
<br><dt><code>-fsemantic-interposition</code><dd><a name="index-fsemantic_002dinterposition-972"></a>Some object formats, like ELF, allow interposing of symbols by the
dynamic linker.
This means that for symbols exported from the DSO, the compiler cannot perform
interprocedural propagation, inlining and other optimizations in anticipation
that the function or variable in question may change. While this feature is
useful, for example, to rewrite memory allocation functions by a debugging
implementation, it is expensive in terms of code quality.
With <samp><span class="option">-fno-semantic-interposition</span></samp> the compiler assumes that
if interposition happens for functions, the overwriting function will have
precisely the same semantics (and side effects).
Similarly if interposition happens
for variables, the constructor of the variable will be the same. The flag
has no effect for functions explicitly declared inline
(where it is never allowed for interposition to change semantics)
and for symbols explicitly declared weak.
<br><dt><code>-fshrink-wrap</code><dd><a name="index-fshrink_002dwrap-973"></a>Emit function prologues only before parts of the function that need it,
rather than at the top of the function. This flag is enabled by default at
<samp><span class="option">-O</span></samp> and higher.
<br><dt><code>-fcaller-saves</code><dd><a name="index-fcaller_002dsaves-974"></a>Enable allocation of values to registers that are clobbered by
function calls, by emitting extra instructions to save and restore the
registers around such calls. Such allocation is done only when it
seems to result in better code.
<p>This option is always enabled by default on certain machines, usually
those which have no call-preserved registers to use instead.
<p>Enabled at levels <samp><span class="option">-O2</span></samp>, <samp><span class="option">-O3</span></samp>, <samp><span class="option">-Os</span></samp>.
<br><dt><code>-fcombine-stack-adjustments</code><dd><a name="index-fcombine_002dstack_002dadjustments-975"></a>Tracks stack adjustments (pushes and pops) and stack memory references
and then tries to find ways to combine them.
<p>Enabled by default at <samp><span class="option">-O1</span></samp> and higher.
<br><dt><code>-fipa-ra</code><dd><a name="index-fipa_002dra-976"></a>Use caller save registers for allocation if those registers are not used by
any called function. In that case it is not necessary to save and restore
them around calls. This is only possible if called functions are part of
the same compilation unit as the current function and are compiled before it.
<p>Enabled at levels <samp><span class="option">-O2</span></samp>, <samp><span class="option">-O3</span></samp>, <samp><span class="option">-Os</span></samp>.
<br><dt><code>-fconserve-stack</code><dd><a name="index-fconserve_002dstack-977"></a>Attempt to minimize stack usage. The compiler attempts to use less
stack space, even if that makes the program slower. This option
implies setting the <samp><span class="option">large-stack-frame</span></samp> parameter to 100
and the <samp><span class="option">large-stack-frame-growth</span></samp> parameter to 400.
<br><dt><code>-ftree-reassoc</code><dd><a name="index-ftree_002dreassoc-978"></a>Perform reassociation on trees. This flag is enabled by default
at <samp><span class="option">-O</span></samp> and higher.
<br><dt><code>-ftree-pre</code><dd><a name="index-ftree_002dpre-979"></a>Perform partial redundancy elimination (PRE) on trees. This flag is
enabled by default at <samp><span class="option">-O2</span></samp> and <samp><span class="option">-O3</span></samp>.
<br><dt><code>-ftree-partial-pre</code><dd><a name="index-ftree_002dpartial_002dpre-980"></a>Make partial redundancy elimination (PRE) more aggressive. This flag is
enabled by default at <samp><span class="option">-O3</span></samp>.
<br><dt><code>-ftree-forwprop</code><dd><a name="index-ftree_002dforwprop-981"></a>Perform forward propagation on trees. This flag is enabled by default
at <samp><span class="option">-O</span></samp> and higher.
<br><dt><code>-ftree-fre</code><dd><a name="index-ftree_002dfre-982"></a>Perform full redundancy elimination (FRE) on trees. The difference
between FRE and PRE is that FRE only considers expressions
that are computed on all paths leading to the redundant computation.
This analysis is faster than PRE, though it exposes fewer redundancies.
This flag is enabled by default at <samp><span class="option">-O</span></samp> and higher.
<br><dt><code>-ftree-phiprop</code><dd><a name="index-ftree_002dphiprop-983"></a>Perform hoisting of loads from conditional pointers on trees. This
pass is enabled by default at <samp><span class="option">-O</span></samp> and higher.
<br><dt><code>-fhoist-adjacent-loads</code><dd><a name="index-fhoist_002dadjacent_002dloads-984"></a>Speculatively hoist loads from both branches of an if-then-else if the
loads are from adjacent locations in the same structure and the target
architecture has a conditional move instruction. This flag is enabled
by default at <samp><span class="option">-O2</span></samp> and higher.
<br><dt><code>-ftree-copy-prop</code><dd><a name="index-ftree_002dcopy_002dprop-985"></a>Perform copy propagation on trees. This pass eliminates unnecessary
copy operations. This flag is enabled by default at <samp><span class="option">-O</span></samp> and
higher.
<br><dt><code>-fipa-pure-const</code><dd><a name="index-fipa_002dpure_002dconst-986"></a>Discover which functions are pure or constant.
Enabled by default at <samp><span class="option">-O</span></samp> and higher.
<br><dt><code>-fipa-reference</code><dd><a name="index-fipa_002dreference-987"></a>Discover which static variables do not escape the
compilation unit.
Enabled by default at <samp><span class="option">-O</span></samp> and higher.
<br><dt><code>-fipa-pta</code><dd><a name="index-fipa_002dpta-988"></a>Perform interprocedural pointer analysis and interprocedural modification
and reference analysis. This option can cause excessive memory and
compile-time usage on large compilation units. It is not enabled by
default at any optimization level.
<br><dt><code>-fipa-profile</code><dd><a name="index-fipa_002dprofile-989"></a>Perform interprocedural profile propagation. The functions called only from
cold functions are marked as cold. Also functions executed once (such as
<code>cold</code>, <code>noreturn</code>, static constructors or destructors) are identified. Cold
functions and loopless parts of functions executed once are then optimized for
size.
Enabled by default at <samp><span class="option">-O</span></samp> and higher.
<br><dt><code>-fipa-cp</code><dd><a name="index-fipa_002dcp-990"></a>Perform interprocedural constant propagation.
This optimization analyzes the program to determine when values passed
to functions are constants and then optimizes accordingly.
This optimization can substantially increase performance
if the application has constants passed to functions.
This flag is enabled by default at <samp><span class="option">-O2</span></samp>, <samp><span class="option">-Os</span></samp> and <samp><span class="option">-O3</span></samp>.
<br><dt><code>-fipa-cp-clone</code><dd><a name="index-fipa_002dcp_002dclone-991"></a>Perform function cloning to make interprocedural constant propagation stronger.
When enabled, interprocedural constant propagation performs function cloning
when externally visible function can be called with constant arguments.
Because this optimization can create multiple copies of functions,
it may significantly increase code size
(see <samp><span class="option">--param ipcp-unit-growth=</span><var>value</var></samp>).
This flag is enabled by default at <samp><span class="option">-O3</span></samp>.
<br><dt><code>-fipa-cp-alignment</code><dd><a name="index-g_t_002dfipa_002dcp_002dalignment-992"></a>When enabled, this optimization propagates alignment of function
parameters to support better vectorization and string operations.
<p>This flag is enabled by default at <samp><span class="option">-O2</span></samp> and <samp><span class="option">-Os</span></samp>. It
requires that <samp><span class="option">-fipa-cp</span></samp> is enabled.
<br><dt><code>-fipa-icf</code><dd><a name="index-fipa_002dicf-993"></a>Perform Identical Code Folding for functions and read-only variables.
The optimization reduces code size and may disturb unwind stacks by replacing
a function by an equivalent one with a different name. The optimization works
more effectively with link-time optimization enabled.
<p>The behavior is similar to the Gold linker's ICF optimization; however, GCC ICF
works on a different level, so the two do not find the same set of equivalences:
some are found only by GCC and some only by Gold.
<p>This flag is enabled by default at <samp><span class="option">-O2</span></samp> and <samp><span class="option">-Os</span></samp>.
<br><dt><code>-fisolate-erroneous-paths-dereference</code><dd><a name="index-fisolate_002derroneous_002dpaths_002ddereference-994"></a>Detect paths that trigger erroneous or undefined behavior due to
dereferencing a null pointer. Isolate those paths from the main control
flow and turn the statement with erroneous or undefined behavior into a trap.
This flag is enabled by default at <samp><span class="option">-O2</span></samp> and higher.
<br><dt><code>-fisolate-erroneous-paths-attribute</code><dd><a name="index-fisolate_002derroneous_002dpaths_002dattribute-995"></a>Detect paths that trigger erroneous or undefined behavior due to a null value
being used in a way forbidden by a <code>returns_nonnull</code> or <code>nonnull</code>
attribute. Isolate those paths from the main control flow and turn the
statement with erroneous or undefined behavior into a trap. This is not
currently enabled, but may be enabled by <samp><span class="option">-O2</span></samp> in the future.
<br><dt><code>-ftree-sink</code><dd><a name="index-ftree_002dsink-996"></a>Perform forward store motion on trees. This flag is
enabled by default at <samp><span class="option">-O</span></samp> and higher.
<br><dt><code>-ftree-bit-ccp</code><dd><a name="index-ftree_002dbit_002dccp-997"></a>Perform sparse conditional bit constant propagation on trees and propagate
pointer alignment information.
This pass only operates on local scalar variables and is enabled by default
at <samp><span class="option">-O</span></samp> and higher. It requires that <samp><span class="option">-ftree-ccp</span></samp> is enabled.
<br><dt><code>-ftree-ccp</code><dd><a name="index-ftree_002dccp-998"></a>Perform sparse conditional constant propagation (CCP) on trees. This
pass only operates on local scalar variables and is enabled by default
at <samp><span class="option">-O</span></samp> and higher.
<br><dt><code>-fssa-phiopt</code><dd><a name="index-fssa_002dphiopt-999"></a>Perform pattern matching on SSA PHI nodes to optimize conditional
code. This pass is enabled by default at <samp><span class="option">-O</span></samp> and higher.
<br><dt><code>-ftree-switch-conversion</code><dd><a name="index-ftree_002dswitch_002dconversion-1000"></a>Perform conversion of simple initializations in a switch to
initializations from a scalar array. This flag is enabled by default
at <samp><span class="option">-O2</span></samp> and higher.
<br><dt><code>-ftree-tail-merge</code><dd><a name="index-ftree_002dtail_002dmerge-1001"></a>Look for identical code sequences. When found, replace one with a jump to the
other. This optimization is known as tail merging or cross jumping. This flag
is enabled by default at <samp><span class="option">-O2</span></samp> and higher. The compilation time
spent in this pass can be limited using the
<samp><span class="option">max-tail-merge-comparisons</span></samp> and
<samp><span class="option">max-tail-merge-iterations</span></samp> parameters.
<br><dt><code>-ftree-dce</code><dd><a name="index-ftree_002ddce-1002"></a>Perform dead code elimination (DCE) on trees. This flag is enabled by
default at <samp><span class="option">-O</span></samp> and higher.
<br><dt><code>-ftree-builtin-call-dce</code><dd><a name="index-ftree_002dbuiltin_002dcall_002ddce-1003"></a>Perform conditional dead code elimination (DCE) for calls to built-in functions
that may set <code>errno</code> but are otherwise side-effect free. This flag is
enabled by default at <samp><span class="option">-O2</span></samp> and higher if <samp><span class="option">-Os</span></samp> is not also
specified.
<br><dt><code>-ftree-dominator-opts</code><dd><a name="index-ftree_002ddominator_002dopts-1004"></a>Perform a variety of simple scalar cleanups (constant/copy
propagation, redundancy elimination, range propagation and expression
simplification) based on a dominator tree traversal. This also
performs jump threading (to reduce jumps to jumps). This flag is
enabled by default at <samp><span class="option">-O</span></samp> and higher.
<br><dt><code>-ftree-dse</code><dd><a name="index-ftree_002ddse-1005"></a>Perform dead store elimination (DSE) on trees. A dead store is a store into
a memory location that is later overwritten by another store without
any intervening loads. In this case the earlier store can be deleted. This
flag is enabled by default at <samp><span class="option">-O</span></samp> and higher.
<br><dt><code>-ftree-ch</code><dd><a name="index-ftree_002dch-1006"></a>Perform loop header copying on trees. This is beneficial since it increases
effectiveness of code motion optimizations. It also saves one jump. This flag
is enabled by default at <samp><span class="option">-O</span></samp> and higher. It is not enabled
for <samp><span class="option">-Os</span></samp>, since it usually increases code size.
<br><dt><code>-ftree-loop-optimize</code><dd><a name="index-ftree_002dloop_002doptimize-1007"></a>Perform loop optimizations on trees. This flag is enabled by default
at <samp><span class="option">-O</span></samp> and higher.
<br><dt><code>-ftree-loop-linear</code><dd><a name="index-ftree_002dloop_002dlinear-1008"></a>Perform loop interchange transformations on trees. Same as
<samp><span class="option">-floop-interchange</span></samp>. To use this code transformation, GCC has
to be configured with <samp><span class="option">--with-isl</span></samp> to enable the Graphite loop
transformation infrastructure.
<br><dt><code>-floop-interchange</code><dd><a name="index-floop_002dinterchange-1009"></a>Perform loop interchange transformations on loops. Interchanging two
nested loops switches the inner and outer loops. For example, given a
loop like:
<pre class="smallexample"> DO J = 1, M
DO I = 1, N
A(J, I) = A(J, I) * C
ENDDO
ENDDO
</pre>
<p class="noindent">loop interchange transforms the loop as if it were written:
<pre class="smallexample"> DO I = 1, N
DO J = 1, M
A(J, I) = A(J, I) * C
ENDDO
ENDDO
</pre>
<p>which can be beneficial when <code>N</code> is larger than the caches,
because in Fortran, the elements of an array are stored in memory
contiguously by column, and the original loop iterates over rows,
potentially creating at each access a cache miss. This optimization
applies to all the languages supported by GCC and is not limited to
Fortran. To use this code transformation, GCC has to be configured
with <samp><span class="option">--with-isl</span></samp> to enable the Graphite loop transformation
infrastructure.
<br><dt><code>-floop-strip-mine</code><dd><a name="index-floop_002dstrip_002dmine-1010"></a>Perform loop strip mining transformations on loops. Strip mining
splits a loop into two nested loops. The outer loop has strides
equal to the strip size and the inner loop has strides of the
original loop within a strip. The strip length can be changed
using the <samp><span class="option">loop-block-tile-size</span></samp> parameter. For example,
given a loop like:
<pre class="smallexample"> DO I = 1, N
A(I) = A(I) + C
ENDDO
</pre>
<p class="noindent">loop strip mining transforms the loop as if it were written:
<pre class="smallexample"> DO II = 1, N, 51
DO I = II, min (II + 50, N)
A(I) = A(I) + C
ENDDO
ENDDO
</pre>
<p>This optimization applies to all the languages supported by GCC and is
not limited to Fortran. To use this code transformation, GCC has to
be configured with <samp><span class="option">--with-isl</span></samp> to enable the Graphite loop
transformation infrastructure.
<br><dt><code>-floop-block</code><dd><a name="index-floop_002dblock-1011"></a>Perform loop blocking transformations on loops. Blocking strip mines
each loop in the loop nest such that the memory accesses of the
element loops fit inside caches. The strip length can be changed
using the <samp><span class="option">loop-block-tile-size</span></samp> parameter. For example, given
a loop like:
<pre class="smallexample"> DO I = 1, N
DO J = 1, M
A(J, I) = B(I) + C(J)
ENDDO
ENDDO
</pre>
<p class="noindent">loop blocking transforms the loop as if it were written:
<pre class="smallexample"> DO II = 1, N, 51
DO JJ = 1, M, 51
DO I = II, min (II + 50, N)
DO J = JJ, min (JJ + 50, M)
A(J, I) = B(I) + C(J)
ENDDO
ENDDO
ENDDO
ENDDO
</pre>
<p>which can be beneficial when <code>M</code> is larger than the caches,
because the innermost loop iterates over a smaller amount of data
which can be kept in the caches. This optimization applies to all the
languages supported by GCC and is not limited to Fortran. To use this
code transformation, GCC has to be configured with <samp><span class="option">--with-isl</span></samp>
to enable the Graphite loop transformation infrastructure.
<br><dt><code>-fgraphite-identity</code><dd><a name="index-fgraphite_002didentity-1012"></a>Enable the identity transformation for graphite. For every SCoP we generate
the polyhedral representation and transform it back to gimple. Using
<samp><span class="option">-fgraphite-identity</span></samp> we can check the costs or benefits of the
GIMPLE -> GRAPHITE -> GIMPLE transformation. Some minimal optimizations
are also performed by the code generator ISL, like index splitting and
dead code elimination in loops.
<br><dt><code>-floop-nest-optimize</code><dd><a name="index-floop_002dnest_002doptimize-1013"></a>Enable the ISL based loop nest optimizer. This is a generic loop nest
optimizer based on the Pluto optimization algorithms. It calculates a loop
structure optimized for data-locality and parallelism. This option
is experimental.
<br><dt><code>-floop-unroll-and-jam</code><dd><a name="index-floop_002dunroll_002dand_002djam-1014"></a>Enable unroll and jam for the ISL based loop nest optimizer. The unroll
factor can be changed using the <samp><span class="option">loop-unroll-jam-size</span></samp> parameter.
The unrolled dimension (counting from the innermost one) can be changed
using the <samp><span class="option">loop-unroll-jam-depth</span></samp> parameter.
<br><dt><code>-floop-parallelize-all</code><dd><a name="index-floop_002dparallelize_002dall-1015"></a>Use the Graphite data dependence analysis to identify loops that can
be parallelized. Parallelize all the loops that can be analyzed to
not contain loop-carried dependences, without checking that it is
profitable to parallelize the loops.
<br><dt><code>-fcheck-data-deps</code><dd><a name="index-fcheck_002ddata_002ddeps-1016"></a>Compare the results of several data dependence analyzers. This option
is used for debugging the data dependence analyzers.
<br><dt><code>-ftree-loop-if-convert</code><dd><a name="index-ftree_002dloop_002dif_002dconvert-1017"></a>Attempt to transform conditional jumps in the innermost loops to
branch-less equivalents. The intent is to remove control-flow from
the innermost loops in order to improve the ability of the
vectorization pass to handle these loops. This is enabled by default
if vectorization is enabled.
<br><dt><code>-ftree-loop-if-convert-stores</code><dd><a name="index-ftree_002dloop_002dif_002dconvert_002dstores-1018"></a>Attempt to also if-convert conditional jumps containing memory writes.
This transformation can be unsafe for multi-threaded programs as it
transforms conditional memory writes into unconditional memory writes.
For example,
<pre class="smallexample"> for (i = 0; i < N; i++)
if (cond)
A[i] = expr;
</pre>
<p>is transformed to
<pre class="smallexample"> for (i = 0; i < N; i++)
A[i] = cond ? expr : A[i];
</pre>
<p>potentially producing data races.
<br><dt><code>-ftree-loop-distribution</code><dd><a name="index-ftree_002dloop_002ddistribution-1019"></a>Perform loop distribution. This flag can improve cache performance on
big loop bodies and allow further loop optimizations, like
parallelization or vectorization, to take place. For example, the loop
<pre class="smallexample"> DO I = 1, N
A(I) = B(I) + C
D(I) = E(I) * F
ENDDO
</pre>
<p>is transformed to
<pre class="smallexample"> DO I = 1, N
A(I) = B(I) + C
ENDDO
DO I = 1, N
D(I) = E(I) * F
ENDDO
</pre>
<br><dt><code>-ftree-loop-distribute-patterns</code><dd><a name="index-ftree_002dloop_002ddistribute_002dpatterns-1020"></a>Perform loop distribution of patterns that can be code generated with
calls to a library. This flag is enabled by default at <samp><span class="option">-O3</span></samp>.
<p>This pass distributes the initialization loops and generates a call to
memset zero. For example, the loop
<pre class="smallexample"> DO I = 1, N
A(I) = 0
B(I) = A(I) + I
ENDDO
</pre>
<p>is transformed to
<pre class="smallexample"> DO I = 1, N
A(I) = 0
ENDDO
DO I = 1, N
B(I) = A(I) + I
ENDDO
</pre>
<p>and the initialization loop is transformed into a call to memset zero.
<br><dt><code>-ftree-loop-im</code><dd><a name="index-ftree_002dloop_002dim-1021"></a>Perform loop invariant motion on trees. This pass moves only invariants that
are hard to handle at RTL level (function calls, operations that expand to
nontrivial sequences of insns). With <samp><span class="option">-funswitch-loops</span></samp> it also moves
operands of conditions that are invariant out of the loop, so that we can use
just trivial invariantness analysis in loop unswitching. The pass also includes
store motion.
<br><dt><code>-ftree-loop-ivcanon</code><dd><a name="index-ftree_002dloop_002divcanon-1022"></a>Create a canonical counter for number of iterations in loops for which
determining number of iterations requires complicated analysis. Later
optimizations then may determine the number easily. Useful especially
in connection with unrolling.
<br><dt><code>-fivopts</code><dd><a name="index-fivopts-1023"></a>Perform induction variable optimizations (strength reduction, induction
variable merging and induction variable elimination) on trees.
<br><dt><code>-ftree-parallelize-loops=</code><var>n</var><dd><a name="index-ftree_002dparallelize_002dloops-1024"></a>Parallelize loops, i.e., split their iteration space to run in <var>n</var> threads.
This is only possible for loops whose iterations are independent
and can be arbitrarily reordered. The optimization is only
profitable on multiprocessor machines, for loops that are CPU-intensive,
rather than constrained e.g. by memory bandwidth. This option
implies <samp><span class="option">-pthread</span></samp>, and thus is only supported on targets
that have support for <samp><span class="option">-pthread</span></samp>.
<br><dt><code>-ftree-pta</code><dd><a name="index-ftree_002dpta-1025"></a>Perform function-local points-to analysis on trees. This flag is
enabled by default at <samp><span class="option">-O</span></samp> and higher.
<br><dt><code>-ftree-sra</code><dd><a name="index-ftree_002dsra-1026"></a>Perform scalar replacement of aggregates. This pass replaces structure
references with scalars to prevent committing structures to memory too
early. This flag is enabled by default at <samp><span class="option">-O</span></samp> and higher.
<br><dt><code>-ftree-copyrename</code><dd><a name="index-ftree_002dcopyrename-1027"></a>Perform copy renaming on trees. This pass attempts to rename compiler
temporaries to other variables at copy locations, usually resulting in
variable names which more closely resemble the original variables. This flag
is enabled by default at <samp><span class="option">-O</span></samp> and higher.
<br><dt><code>-ftree-coalesce-inlined-vars</code><dd><a name="index-ftree_002dcoalesce_002dinlined_002dvars-1028"></a>Tell the copyrename pass (see <samp><span class="option">-ftree-copyrename</span></samp>) to attempt to
combine small user-defined variables too, but only if they are inlined
from other functions. It is a more limited form of
<samp><span class="option">-ftree-coalesce-vars</span></samp>. This may harm debug information of such
inlined variables, but it keeps variables of the inlined-into
function apart from each other, such that they are more likely to
contain the expected values in a debugging session.
<br><dt><code>-ftree-coalesce-vars</code><dd><a name="index-ftree_002dcoalesce_002dvars-1029"></a>Tell the copyrename pass (see <samp><span class="option">-ftree-copyrename</span></samp>) to attempt to
combine small user-defined variables too, instead of just compiler
temporaries. This may severely limit the ability to debug an optimized
program compiled with <samp><span class="option">-fno-var-tracking-assignments</span></samp>. In the
negated form, this flag prevents SSA coalescing of user variables,
including inlined ones. This option is enabled by default.
<br><dt><code>-ftree-ter</code><dd><a name="index-ftree_002dter-1030"></a>Perform temporary expression replacement during the SSA->normal phase. Single
use/single def temporaries are replaced at their use location with their
defining expression. This results in non-GIMPLE code, but gives the expanders
much more complex trees to work on resulting in better RTL generation. This is
enabled by default at <samp><span class="option">-O</span></samp> and higher.
<br><dt><code>-ftree-slsr</code><dd><a name="index-ftree_002dslsr-1031"></a>Perform straight-line strength reduction on trees. This recognizes related
expressions involving multiplications and replaces them by less expensive
calculations when possible. This is enabled by default at <samp><span class="option">-O</span></samp> and
higher.
<br><dt><code>-ftree-vectorize</code><dd><a name="index-ftree_002dvectorize-1032"></a>Perform vectorization on trees. This flag enables <samp><span class="option">-ftree-loop-vectorize</span></samp>
and <samp><span class="option">-ftree-slp-vectorize</span></samp> if not explicitly specified.
<br><dt><code>-ftree-loop-vectorize</code><dd><a name="index-ftree_002dloop_002dvectorize-1033"></a>Perform loop vectorization on trees. This flag is enabled by default at
<samp><span class="option">-O3</span></samp> and when <samp><span class="option">-ftree-vectorize</span></samp> is enabled.
<br><dt><code>-ftree-slp-vectorize</code><dd><a name="index-ftree_002dslp_002dvectorize-1034"></a>Perform basic block vectorization on trees. This flag is enabled by default at
<samp><span class="option">-O3</span></samp> and when <samp><span class="option">-ftree-vectorize</span></samp> is enabled.
<br><dt><code>-fvect-cost-model=</code><var>model</var><dd><a name="index-fvect_002dcost_002dmodel-1035"></a>Alter the cost model used for vectorization. The <var>model</var> argument
should be one of ‘<samp><span class="samp">unlimited</span></samp>’, ‘<samp><span class="samp">dynamic</span></samp>’ or ‘<samp><span class="samp">cheap</span></samp>’.
With the ‘<samp><span class="samp">unlimited</span></samp>’ model the vectorized code-path is assumed
to be profitable while with the ‘<samp><span class="samp">dynamic</span></samp>’ model a runtime check
guards the vectorized code-path to enable it only for iteration
counts that will likely execute faster than when executing the original
scalar loop. The ‘<samp><span class="samp">cheap</span></samp>’ model disables vectorization of
loops where doing so would be cost prohibitive for example due to
required runtime checks for data dependence or alignment but otherwise
is equal to the ‘<samp><span class="samp">dynamic</span></samp>’ model.
The default cost model depends on other optimization flags and is
either ‘<samp><span class="samp">dynamic</span></samp>’ or ‘<samp><span class="samp">cheap</span></samp>’.
<br><dt><code>-fsimd-cost-model=</code><var>model</var><dd><a name="index-fsimd_002dcost_002dmodel-1036"></a>Alter the cost model used for vectorization of loops marked with the OpenMP
or Cilk Plus simd directive. The <var>model</var> argument should be one of
‘<samp><span class="samp">unlimited</span></samp>’, ‘<samp><span class="samp">dynamic</span></samp>’, ‘<samp><span class="samp">cheap</span></samp>’. All values of <var>model</var>
have the same meaning as described in <samp><span class="option">-fvect-cost-model</span></samp> and by
default a cost model defined with <samp><span class="option">-fvect-cost-model</span></samp> is used.
<br><dt><code>-ftree-vrp</code><dd><a name="index-ftree_002dvrp-1037"></a>Perform Value Range Propagation on trees. This is similar to the
constant propagation pass, but instead of values, ranges of values are
propagated. This allows the optimizers to remove unnecessary range
checks like array bound checks and null pointer checks. This is
enabled by default at <samp><span class="option">-O2</span></samp> and higher. Null pointer check
elimination is only done if <samp><span class="option">-fdelete-null-pointer-checks</span></samp> is
enabled.
<br><dt><code>-fsplit-ivs-in-unroller</code><dd><a name="index-fsplit_002divs_002din_002dunroller-1038"></a>Enables expression of values of induction variables in later iterations
of the unrolled loop using the value in the first iteration. This breaks
long dependency chains, thus improving efficiency of the scheduling passes.
<p>A combination of <samp><span class="option">-fweb</span></samp> and CSE is often sufficient to obtain the
same effect. However, that is not reliable in cases where the loop body
is more complicated than a single basic block. It also does not work at all
on some architectures due to restrictions in the CSE pass.
<p>This optimization is enabled by default.
<br><dt><code>-fvariable-expansion-in-unroller</code><dd><a name="index-fvariable_002dexpansion_002din_002dunroller-1039"></a>With this option, the compiler creates multiple copies of some
local variables when unrolling a loop, which can result in superior code.
<br><dt><code>-fpartial-inlining</code><dd><a name="index-fpartial_002dinlining-1040"></a>Inline parts of functions. This option has an effect only
when inlining itself is turned on by the <samp><span class="option">-finline-functions</span></samp>
or <samp><span class="option">-finline-small-functions</span></samp> options.
<p>Enabled at level <samp><span class="option">-O2</span></samp>.
<br><dt><code>-fpredictive-commoning</code><dd><a name="index-fpredictive_002dcommoning-1041"></a>Perform predictive commoning optimization, i.e., reusing computations
(especially memory loads and stores) performed in previous
iterations of loops.
<p>This option is enabled at level <samp><span class="option">-O3</span></samp>.
<br><dt><code>-fprefetch-loop-arrays</code><dd><a name="index-fprefetch_002dloop_002darrays-1042"></a>If supported by the target machine, generate instructions to prefetch
memory to improve the performance of loops that access large arrays.
<p>This option may generate better or worse code; results are highly
dependent on the structure of loops within the source code.
<p>Disabled at level <samp><span class="option">-Os</span></samp>.
<br><dt><code>-fno-peephole</code><dt><code>-fno-peephole2</code><dd><a name="index-fno_002dpeephole-1043"></a><a name="index-fno_002dpeephole2-1044"></a>Disable any machine-specific peephole optimizations. The difference
between <samp><span class="option">-fno-peephole</span></samp> and <samp><span class="option">-fno-peephole2</span></samp> is in how they
are implemented in the compiler; some targets use one, some use the
other, a few use both.
<p><samp><span class="option">-fpeephole</span></samp> is enabled by default.
<samp><span class="option">-fpeephole2</span></samp> is enabled at levels <samp><span class="option">-O2</span></samp>, <samp><span class="option">-O3</span></samp>, <samp><span class="option">-Os</span></samp>.
<br><dt><code>-fno-guess-branch-probability</code><dd><a name="index-fno_002dguess_002dbranch_002dprobability-1045"></a>Do not guess branch probabilities using heuristics.
<p>GCC uses heuristics to guess branch probabilities if they are
not provided by profiling feedback (<samp><span class="option">-fprofile-arcs</span></samp>). These
heuristics are based on the control flow graph. If some branch probabilities
are specified by <code>__builtin_expect</code>, then the heuristics are
used to guess branch probabilities for the rest of the control flow graph,
taking the <code>__builtin_expect</code> info into account. The interactions
between the heuristics and <code>__builtin_expect</code> can be complex, and in
some cases, it may be useful to disable the heuristics so that the effects
of <code>__builtin_expect</code> are easier to understand.
<p>The default is <samp><span class="option">-fguess-branch-probability</span></samp> at levels
<samp><span class="option">-O</span></samp>, <samp><span class="option">-O2</span></samp>, <samp><span class="option">-O3</span></samp>, <samp><span class="option">-Os</span></samp>.
<br><dt><code>-freorder-blocks</code><dd><a name="index-freorder_002dblocks-1046"></a>Reorder basic blocks in the compiled function in order to reduce number of
taken branches and improve code locality.
<p>Enabled at levels <samp><span class="option">-O2</span></samp>, <samp><span class="option">-O3</span></samp>.
<br><dt><code>-freorder-blocks-and-partition</code><dd><a name="index-freorder_002dblocks_002dand_002dpartition-1047"></a>In addition to reordering basic blocks in the compiled function, in order
to reduce number of taken branches, partitions hot and cold basic blocks
into separate sections of the assembly and .o files, to improve
paging and cache locality performance.
<p>This optimization is automatically turned off in the presence of
exception handling, for linkonce sections, for functions with a user-defined
section attribute and on any architecture that does not support named
sections.
<p>Enabled for x86 at levels <samp><span class="option">-O2</span></samp>, <samp><span class="option">-O3</span></samp>.
<br><dt><code>-freorder-functions</code><dd><a name="index-freorder_002dfunctions-1048"></a>Reorder functions in the object file in order to
improve code locality. This is implemented by using special
subsections <code>.text.hot</code> for most frequently executed functions and
<code>.text.unlikely</code> for unlikely executed functions. Reordering is done by
the linker so object file format must support named sections and linker must
place them in a reasonable way.
<p>Also profile feedback must be available to make this option effective. See
<samp><span class="option">-fprofile-arcs</span></samp> for details.
<p>Enabled at levels <samp><span class="option">-O2</span></samp>, <samp><span class="option">-O3</span></samp>, <samp><span class="option">-Os</span></samp>.
<br><dt><code>-fstrict-aliasing</code><dd><a name="index-fstrict_002daliasing-1049"></a>Allow the compiler to assume the strictest aliasing rules applicable to
the language being compiled. For C (and C++), this activates
optimizations based on the type of expressions. In particular, an
object of one type is assumed never to reside at the same address as an
object of a different type, unless the types are almost the same. For
example, an <code>unsigned int</code> can alias an <code>int</code>, but not a
<code>void*</code> or a <code>double</code>. A character type may alias any other
type.
<p><a name="Type_002dpunning"></a>Pay special attention to code like this:
<pre class="smallexample"> union a_union {
int i;
double d;
};
int f() {
union a_union t;
t.d = 3.0;
return t.i;
}
</pre>
<p>The practice of reading from a different union member than the one most
recently written to (called “type-punning”) is common. Even with
<samp><span class="option">-fstrict-aliasing</span></samp>, type-punning is allowed, provided the memory
is accessed through the union type. So, the code above works as
expected. See <a href=your_sha256_hash.html#your_sha256_hash>Structures unions enumerations and bit-fields implementation</a>. However, this code might not:
<pre class="smallexample"> int f() {
union a_union t;
int* ip;
t.d = 3.0;
ip = &t.i;
return *ip;
}
</pre>
<p>Similarly, access by taking the address, casting the resulting pointer
and dereferencing the result has undefined behavior, even if the cast
uses a union type, e.g.:
<pre class="smallexample"> int f() {
double d = 3.0;
return ((union a_union *) &d)->i;
}
</pre>
<p>The <samp><span class="option">-fstrict-aliasing</span></samp> option is enabled at levels
<samp><span class="option">-O2</span></samp>, <samp><span class="option">-O3</span></samp>, <samp><span class="option">-Os</span></samp>.
<br><dt><code>-fstrict-overflow</code><dd><a name="index-fstrict_002doverflow-1050"></a>Allow the compiler to assume strict signed overflow rules, depending
on the language being compiled. For C (and C++) this means that
overflow when doing arithmetic with signed numbers is undefined, which
means that the compiler may assume that it does not happen. This
permits various optimizations. For example, the compiler assumes
that an expression like <code>i + 10 > i</code> is always true for
signed <code>i</code>. This assumption is only valid if signed overflow is
undefined, as the expression is false if <code>i + 10</code> overflows when
using two's complement arithmetic. When this option is in effect, any
attempt to determine whether an operation on signed numbers
overflows must be written carefully to not actually involve overflow.
<p>This option also allows the compiler to assume strict pointer
semantics: given a pointer to an object, if adding an offset to that
pointer does not produce a pointer to the same object, the addition is
undefined. This permits the compiler to conclude that <code>p + u >
p</code> is always true for a pointer <code>p</code> and unsigned integer
<code>u</code>. This assumption is only valid because pointer wraparound is
undefined, as the expression is false if <code>p + u</code> overflows using
two's complement arithmetic.
<p>See also the <samp><span class="option">-fwrapv</span></samp> option. Using <samp><span class="option">-fwrapv</span></samp> means
that integer signed overflow is fully defined: it wraps. When
<samp><span class="option">-fwrapv</span></samp> is used, there is no difference between
<samp><span class="option">-fstrict-overflow</span></samp> and <samp><span class="option">-fno-strict-overflow</span></samp> for
integers. With <samp><span class="option">-fwrapv</span></samp> certain types of overflow are
permitted. For example, if the compiler gets an overflow when doing
arithmetic on constants, the overflowed value can still be used with
<samp><span class="option">-fwrapv</span></samp>, but not otherwise.
<p>The <samp><span class="option">-fstrict-overflow</span></samp> option is enabled at levels
<samp><span class="option">-O2</span></samp>, <samp><span class="option">-O3</span></samp>, <samp><span class="option">-Os</span></samp>.
<br><dt><code>-falign-functions</code><dt><code>-falign-functions=</code><var>n</var><dd><a name="index-falign_002dfunctions-1051"></a>Align the start of functions to the next power-of-two greater than
<var>n</var>, skipping up to <var>n</var> bytes. For instance,
<samp><span class="option">-falign-functions=32</span></samp> aligns functions to the next 32-byte
boundary, but <samp><span class="option">-falign-functions=24</span></samp> aligns to the next
32-byte boundary only if this can be done by skipping 23 bytes or less.
<p><samp><span class="option">-fno-align-functions</span></samp> and <samp><span class="option">-falign-functions=1</span></samp> are
equivalent and mean that functions are not aligned.
<p>Some assemblers only support this flag when <var>n</var> is a power of two;
in that case, it is rounded up.
<p>If <var>n</var> is not specified or is zero, use a machine-dependent default.
<p>Enabled at levels <samp><span class="option">-O2</span></samp>, <samp><span class="option">-O3</span></samp>.
<br><dt><code>-falign-labels</code><dt><code>-falign-labels=</code><var>n</var><dd><a name="index-falign_002dlabels-1052"></a>Align all branch targets to a power-of-two boundary, skipping up to
<var>n</var> bytes like <samp><span class="option">-falign-functions</span></samp>. This option can easily
make code slower, because it must insert dummy operations for when the
branch target is reached in the usual flow of the code.
<p><samp><span class="option">-fno-align-labels</span></samp> and <samp><span class="option">-falign-labels=1</span></samp> are
equivalent and mean that labels are not aligned.
<p>If <samp><span class="option">-falign-loops</span></samp> or <samp><span class="option">-falign-jumps</span></samp> are applicable and
are greater than this value, then their values are used instead.
<p>If <var>n</var> is not specified or is zero, use a machine-dependent default
which is very likely to be ‘<samp><span class="samp">1</span></samp>’, meaning no alignment.
<p>Enabled at levels <samp><span class="option">-O2</span></samp>, <samp><span class="option">-O3</span></samp>.
<br><dt><code>-falign-loops</code><dt><code>-falign-loops=</code><var>n</var><dd><a name="index-falign_002dloops-1053"></a>Align loops to a power-of-two boundary, skipping up to <var>n</var> bytes
like <samp><span class="option">-falign-functions</span></samp>. If the loops are
executed many times, this makes up for any execution of the dummy
operations.
<p><samp><span class="option">-fno-align-loops</span></samp> and <samp><span class="option">-falign-loops=1</span></samp> are
equivalent and mean that loops are not aligned.
<p>If <var>n</var> is not specified or is zero, use a machine-dependent default.
<p>Enabled at levels <samp><span class="option">-O2</span></samp>, <samp><span class="option">-O3</span></samp>.
<br><dt><code>-falign-jumps</code><dt><code>-falign-jumps=</code><var>n</var><dd><a name="index-falign_002djumps-1054"></a>Align branch targets to a power-of-two boundary, for branch targets
where the targets can only be reached by jumping, skipping up to <var>n</var>
bytes like <samp><span class="option">-falign-functions</span></samp>. In this case, no dummy operations
need be executed.
<p><samp><span class="option">-fno-align-jumps</span></samp> and <samp><span class="option">-falign-jumps=1</span></samp> are
equivalent and mean that jump targets are not aligned.
<p>If <var>n</var> is not specified or is zero, use a machine-dependent default.
<p>Enabled at levels <samp><span class="option">-O2</span></samp>, <samp><span class="option">-O3</span></samp>.
<br><dt><code>-funit-at-a-time</code><dd><a name="index-funit_002dat_002da_002dtime-1055"></a>This option is left for compatibility reasons. <samp><span class="option">-funit-at-a-time</span></samp>
has no effect, while <samp><span class="option">-fno-unit-at-a-time</span></samp> implies
<samp><span class="option">-fno-toplevel-reorder</span></samp> and <samp><span class="option">-fno-section-anchors</span></samp>.
<p>Enabled by default.
<br><dt><code>-fno-toplevel-reorder</code><dd><a name="index-fno_002dtoplevel_002dreorder-1056"></a>Do not reorder top-level functions, variables, and <code>asm</code>
statements. Output them in the same order that they appear in the
input file. When this option is used, unreferenced static variables
are not removed. This option is intended to support existing code
that relies on a particular ordering. For new code, it is better to
use attributes when possible.
<p>Enabled at level <samp><span class="option">-O0</span></samp>. When disabled explicitly, it also implies
<samp><span class="option">-fno-section-anchors</span></samp>, which is otherwise enabled at <samp><span class="option">-O0</span></samp> on some
targets.
<br><dt><code>-fweb</code><dd><a name="index-fweb-1057"></a>Constructs webs as commonly used for register allocation purposes and assigns
each web an individual pseudo register. This allows the register allocation pass
to operate on pseudos directly, but also strengthens several other optimization
passes, such as CSE, loop optimizer and trivial dead code remover. It can,
however, make debugging impossible, since variables no longer stay in a
“home register”.
<p>Enabled by default with <samp><span class="option">-funroll-loops</span></samp>.
<br><dt><code>-fwhole-program</code><dd><a name="index-fwhole_002dprogram-1058"></a>Assume that the current compilation unit represents the whole program being
compiled. All public functions and variables with the exception of <code>main</code>
and those merged by attribute <code>externally_visible</code> become static functions
and in effect are optimized more aggressively by interprocedural optimizers.
<p>This option should not be used in combination with <samp><span class="option">-flto</span></samp>.
Relying on a linker plugin instead provides safer and more precise
information.
<br><dt><code>-flto[=</code><var>n</var><code>]</code><dd><a name="index-flto-1059"></a>This option runs the standard link-time optimizer. When invoked
with source code, it generates GIMPLE (one of GCC's internal
representations) and writes it to special ELF sections in the object
file. When the object files are linked together, all the function
bodies are read from these ELF sections and instantiated as if they
had been part of the same translation unit.
<p>To use the link-time optimizer, <samp><span class="option">-flto</span></samp> and optimization
options should be specified at compile time and during the final link.
For example:
<pre class="smallexample"> gcc -c -O2 -flto foo.c
gcc -c -O2 -flto bar.c
gcc -o myprog -flto -O2 foo.o bar.o
</pre>
<p>The first two invocations to GCC save a bytecode representation
of GIMPLE into special ELF sections inside <samp><span class="file">foo.o</span></samp> and
<samp><span class="file">bar.o</span></samp>. The final invocation reads the GIMPLE bytecode from
<samp><span class="file">foo.o</span></samp> and <samp><span class="file">bar.o</span></samp>, merges the two files into a single
internal image, and compiles the result as usual. Since both
<samp><span class="file">foo.o</span></samp> and <samp><span class="file">bar.o</span></samp> are merged into a single image, this
causes all the interprocedural analyses and optimizations in GCC to
work across the two files as if they were a single one. This means,
for example, that the inliner is able to inline functions in
<samp><span class="file">bar.o</span></samp> into functions in <samp><span class="file">foo.o</span></samp> and vice-versa.
<p>Another (simpler) way to enable link-time optimization is:
<pre class="smallexample"> gcc -o myprog -flto -O2 foo.c bar.c
</pre>
<p>The above generates bytecode for <samp><span class="file">foo.c</span></samp> and <samp><span class="file">bar.c</span></samp>,
merges them together into a single GIMPLE representation and optimizes
them as usual to produce <samp><span class="file">myprog</span></samp>.
<p>The only important thing to keep in mind is that to enable link-time
optimizations you need to use the GCC driver to perform the link-step.
GCC then automatically performs link-time optimization if any of the
objects involved were compiled with the <samp><span class="option">-flto</span></samp> command-line option.
You generally
should specify the optimization options to be used for link-time
optimization, though GCC tries to be clever at guessing an
optimization level to use from the options used at compile-time
if you fail to specify one at link-time. You can always override
the automatic decision to do link-time optimization at link-time
by passing <samp><span class="option">-fno-lto</span></samp> to the link command.
<p>To make whole program optimization effective, it is necessary to make
certain whole program assumptions. The compiler needs to know
what functions and variables can be accessed by libraries and runtime
outside of the link-time optimized unit. When supported by the linker,
the linker plugin (see <samp><span class="option">-fuse-linker-plugin</span></samp>) passes information
to the compiler about used and externally visible symbols. When
the linker plugin is not available, <samp><span class="option">-fwhole-program</span></samp> should be
used to allow the compiler to make these assumptions, which leads
to more aggressive optimization decisions.
<p>When <samp><span class="option">-fuse-linker-plugin</span></samp> is not enabled then, when a file is
compiled with <samp><span class="option">-flto</span></samp>, the generated object file is larger than
a regular object file because it contains GIMPLE bytecodes and the usual
final code (see <samp><span class="option">-ffat-lto-objects</span></samp>). This means that
object files with LTO information can be linked as normal object
files; if <samp><span class="option">-fno-lto</span></samp> is passed to the linker, no
interprocedural optimizations are applied. Note that when
<samp><span class="option">-fno-fat-lto-objects</span></samp> is enabled the compile-stage is faster
but you cannot perform a regular, non-LTO link on them.
<p>Additionally, the optimization flags used to compile individual files
are not necessarily related to those used at link time. For instance,
<pre class="smallexample"> gcc -c -O0 -ffat-lto-objects -flto foo.c
gcc -c -O0 -ffat-lto-objects -flto bar.c
gcc -o myprog -O3 foo.o bar.o
</pre>
<p>This produces individual object files with unoptimized assembler
code, but the resulting binary <samp><span class="file">myprog</span></samp> is optimized at
<samp><span class="option">-O3</span></samp>. If, instead, the final binary is generated with
<samp><span class="option">-fno-lto</span></samp>, then <samp><span class="file">myprog</span></samp> is not optimized.
<p>When producing the final binary, GCC only
applies link-time optimizations to those files that contain bytecode.
Therefore, you can mix and match object files and libraries with
GIMPLE bytecodes and final object code. GCC automatically selects
which files to optimize in LTO mode and which files to link without
further processing.
<p>There are some code generation flags preserved by GCC when
generating bytecodes, as they need to be used during the final link
stage. Generally options specified at link-time override those
specified at compile-time.
<p>If you do not specify an optimization level option <samp><span class="option">-O</span></samp> at
link-time then GCC computes one based on the optimization levels
used when compiling the object files. The highest optimization
level wins here.
<p>Currently, the following options and their settings are taken from
the first object file that explicitly specified them:
<samp><span class="option">-fPIC</span></samp>, <samp><span class="option">-fpic</span></samp>, <samp><span class="option">-fpie</span></samp>, <samp><span class="option">-fcommon</span></samp>,
<samp><span class="option">-fexceptions</span></samp>, <samp><span class="option">-fnon-call-exceptions</span></samp>, <samp><span class="option">-fgnu-tm</span></samp>
and all the <samp><span class="option">-m</span></samp> target flags.
<p>Certain ABI changing flags are required to match in all compilation-units
and trying to override this at link-time with a conflicting value
is ignored. This includes options such as <samp><span class="option">-freg-struct-return</span></samp>
and <samp><span class="option">-fpcc-struct-return</span></samp>.
<p>Other options such as <samp><span class="option">-ffp-contract</span></samp>, <samp><span class="option">-fno-strict-overflow</span></samp>,
<samp><span class="option">-fwrapv</span></samp>, <samp><span class="option">-fno-trapv</span></samp> or <samp><span class="option">-fno-strict-aliasing</span></samp>
are passed through to the link stage and merged conservatively for
conflicting translation units. Specifically
<samp><span class="option">-fno-strict-overflow</span></samp>, <samp><span class="option">-fwrapv</span></samp> and <samp><span class="option">-fno-trapv</span></samp> take
precedence and for example <samp><span class="option">-ffp-contract=off</span></samp> takes precedence
over <samp><span class="option">-ffp-contract=fast</span></samp>. You can override them at link time.
<p>It is recommended that you compile all the files participating in the
same link with the same options and also specify those options at
link time.
<p>If LTO encounters objects with C linkage declared with incompatible
types in separate translation units to be linked together (undefined
behavior according to ISO C99 6.2.7), a non-fatal diagnostic may be
issued. The behavior is still undefined at run time. Similar
diagnostics may be raised for other languages.
<p>Another feature of LTO is that it is possible to apply interprocedural
optimizations on files written in different languages:
<pre class="smallexample"> gcc -c -flto foo.c
g++ -c -flto bar.cc
gfortran -c -flto baz.f90
g++ -o myprog -flto -O3 foo.o bar.o baz.o -lgfortran
</pre>
<p>Notice that the final link is done with <samp><span class="command">g++</span></samp> to get the C++
runtime libraries and <samp><span class="option">-lgfortran</span></samp> is added to get the Fortran
runtime libraries. In general, when mixing languages in LTO mode, you
should use the same link command options as when mixing languages in a
regular (non-LTO) compilation.
<p>If object files containing GIMPLE bytecode are stored in a library archive, say
<samp><span class="file">libfoo.a</span></samp>, it is possible to extract and use them in an LTO link if you
are using a linker with plugin support. To create static libraries suitable
for LTO, use <samp><span class="command">gcc-ar</span></samp> and <samp><span class="command">gcc-ranlib</span></samp> instead of <samp><span class="command">ar</span></samp>
and <samp><span class="command">ranlib</span></samp>;
to show the symbols of object files with GIMPLE bytecode, use
<samp><span class="command">gcc-nm</span></samp>. Those commands require that <samp><span class="command">ar</span></samp>, <samp><span class="command">ranlib</span></samp>
and <samp><span class="command">nm</span></samp> have been compiled with plugin support. At link time, use the
flag <samp><span class="option">-fuse-linker-plugin</span></samp> to ensure that the library participates in
the LTO optimization process:
<pre class="smallexample"> gcc -o myprog -O2 -flto -fuse-linker-plugin a.o b.o -lfoo
</pre>
<p>With the linker plugin enabled, the linker extracts the needed
GIMPLE files from <samp><span class="file">libfoo.a</span></samp> and passes them on to the running GCC
to make them part of the aggregated GIMPLE image to be optimized.
<p>If you are not using a linker with plugin support and/or do not
enable the linker plugin, then the objects inside <samp><span class="file">libfoo.a</span></samp>
are extracted and linked as usual, but they do not participate
in the LTO optimization process. In order to make a static library suitable
for both LTO optimization and usual linkage, compile its object files with
<samp><span class="option">-flto</span></samp> <samp><span class="option">-ffat-lto-objects</span></samp>.
<p>Link-time optimizations do not require the presence of the whole program to
operate. If the program does not require any symbols to be exported, it is
possible to combine <samp><span class="option">-flto</span></samp> and <samp><span class="option">-fwhole-program</span></samp> to allow
the interprocedural optimizers to use more aggressive assumptions which may
lead to improved optimization opportunities.
Use of <samp><span class="option">-fwhole-program</span></samp> is not needed when linker plugin is
active (see <samp><span class="option">-fuse-linker-plugin</span></samp>).
<p>The current implementation of LTO makes no
attempt to generate bytecode that is portable between different
types of hosts. The bytecode files are versioned and there is a
strict version check, so bytecode files generated in one version of
GCC do not work with an older or newer version of GCC.
<p>Link-time optimization does not work well with generation of debugging
information. Combining <samp><span class="option">-flto</span></samp> with
<samp><span class="option">-g</span></samp> is currently experimental and expected to produce unexpected
results.
<p>If you specify the optional <var>n</var>, the optimization and code
generation done at link time is executed in parallel using <var>n</var>
parallel jobs by utilizing an installed <samp><span class="command">make</span></samp> program. The
environment variable <samp><span class="env">MAKE</span></samp> may be used to override the program
used. The default value for <var>n</var> is 1.
<p>You can also specify <samp><span class="option">-flto=jobserver</span></samp> to use GNU make's
job server mode to determine the number of parallel jobs. This
is useful when the Makefile calling GCC is already executing in parallel.
You must prepend a ‘<samp><span class="samp">+</span></samp>’ to the command recipe in the parent Makefile
for this to work. This option likely only works if <samp><span class="env">MAKE</span></samp> is
GNU make.
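<p>A minimal Makefile fragment (target and file names are hypothetical) showing the required ‘<samp><span class="samp">+</span></samp>’ prefix on the recipe, which tells GNU make to pass its jobserver file descriptors to the command so GCC can coordinate its link-time jobs with the rest of the parallel build:

```makefile
# Hypothetical fragment; run with e.g. `make -j8 myprog`.
# The leading `+` marks the recipe as jobserver-aware.
myprog: foo.o bar.o
	+gcc -o myprog -flto=jobserver -O2 foo.o bar.o
```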
<br><dt><code>-flto-partition=</code><var>alg</var><dd><a name="index-flto_002dpartition-1060"></a>Specify the partitioning algorithm used by the link-time optimizer.
The value is either ‘<samp><span class="samp">1to1</span></samp>’ to specify a partitioning mirroring
the original source files or ‘<samp><span class="samp">balanced</span></samp>’ to specify partitioning
into equally sized chunks (whenever possible) or ‘<samp><span class="samp">max</span></samp>’ to create
a new partition for every symbol where possible. Specifying ‘<samp><span class="samp">none</span></samp>’
as an algorithm disables partitioning and streaming completely.
The default value is ‘<samp><span class="samp">balanced</span></samp>’. While ‘<samp><span class="samp">1to1</span></samp>’ can be used
as a workaround for various code ordering issues, the ‘<samp><span class="samp">max</span></samp>’
partitioning is intended for internal testing only.
The value ‘<samp><span class="samp">one</span></samp>’ specifies that exactly one partition should be
used while the value ‘<samp><span class="samp">none</span></samp>’ bypasses partitioning and executes
the link-time optimization step directly from the WPA phase.
<br><dt><code>-flto-odr-type-merging</code><dd><a name="index-flto_002dodr_002dtype_002dmerging-1061"></a>Enable streaming of mangled type names of C++ types and their unification
at link time. This increases the size of LTO object files, but enables
diagnostics about One Definition Rule violations.
<br><dt><code>-flto-compression-level=</code><var>n</var><dd><a name="index-flto_002dcompression_002dlevel-1062"></a>This option specifies the level of compression used for intermediate
language written to LTO object files, and is only meaningful in
conjunction with LTO mode (<samp><span class="option">-flto</span></samp>). Valid
values are 0 (no compression) to 9 (maximum compression). Values
outside this range are clamped to either 0 or 9. If the option is not
given, a default balanced compression setting is used.
<br><dt><code>-flto-report</code><dd><a name="index-flto_002dreport-1063"></a>Prints a report with internal details on the workings of the link-time
optimizer. The contents of this report vary from version to version.
It is meant to be useful to GCC developers when processing object
files in LTO mode (via <samp><span class="option">-flto</span></samp>).
<p>Disabled by default.
<br><dt><code>-flto-report-wpa</code><dd><a name="index-flto_002dreport_002dwpa-1064"></a>Like <samp><span class="option">-flto-report</span></samp>, but only print for the WPA phase of Link
Time Optimization.
<br><dt><code>-fuse-linker-plugin</code><dd><a name="index-fuse_002dlinker_002dplugin-1065"></a>Enables the use of a linker plugin during link-time optimization. This
option relies on plugin support in the linker, which is available in gold
or in GNU ld 2.21 or newer.
<p>This option enables the extraction of object files with GIMPLE bytecode out
of library archives. This improves the quality of optimization by exposing
more code to the link-time optimizer. This information specifies what
symbols can be accessed externally (by non-LTO object or during dynamic
linking). Resulting code quality improvements on binaries (and shared
libraries that use hidden visibility) are similar to <samp><span class="option">-fwhole-program</span></samp>.
See <samp><span class="option">-flto</span></samp> for a description of the effect of this flag and how to
use it.
<p>This option is enabled by default when LTO support in GCC is enabled
and GCC was configured for use with
a linker supporting plugins (GNU ld 2.21 or newer or gold).
<br><dt><code>-ffat-lto-objects</code><dd><a name="index-ffat_002dlto_002dobjects-1066"></a>Fat LTO objects are object files that contain both the intermediate language
and the object code. This makes them usable for both LTO linking and normal
linking. This option is effective only when compiling with <samp><span class="option">-flto</span></samp>
and is ignored at link time.
<p><samp><span class="option">-fno-fat-lto-objects</span></samp> improves compilation time over plain LTO, but
requires the complete toolchain to be aware of LTO. It requires a linker with
linker plugin support for basic functionality. Additionally,
<samp><span class="command">nm</span></samp>, <samp><span class="command">ar</span></samp> and <samp><span class="command">ranlib</span></samp>
need to support linker plugins to allow a full-featured build environment
(capable of building static libraries etc). GCC provides the <samp><span class="command">gcc-ar</span></samp>,
<samp><span class="command">gcc-nm</span></samp>, <samp><span class="command">gcc-ranlib</span></samp> wrappers to pass the right options
to these tools. With non-fat LTO, makefiles need to be modified to use them.
<p>The default is <samp><span class="option">-fno-fat-lto-objects</span></samp> on targets with linker plugin
support.
<br><dt><code>-fcompare-elim</code><dd><a name="index-fcompare_002delim-1067"></a>After register allocation and post-register allocation instruction splitting,
identify arithmetic instructions that compute processor flags similar to a
comparison operation based on that arithmetic. If possible, eliminate the
explicit comparison operation.
<p>This pass only applies to certain targets that cannot explicitly represent
the comparison operation before register allocation is complete.
<p>Enabled at levels <samp><span class="option">-O</span></samp>, <samp><span class="option">-O2</span></samp>, <samp><span class="option">-O3</span></samp>, <samp><span class="option">-Os</span></samp>.
<br><dt><code>-fcprop-registers</code><dd><a name="index-fcprop_002dregisters-1068"></a>After register allocation and post-register allocation instruction splitting,
perform a copy-propagation pass to try to reduce scheduling dependencies
and occasionally eliminate the copy.
<p>Enabled at levels <samp><span class="option">-O</span></samp>, <samp><span class="option">-O2</span></samp>, <samp><span class="option">-O3</span></samp>, <samp><span class="option">-Os</span></samp>.
<br><dt><code>-fprofile-correction</code><dd><a name="index-fprofile_002dcorrection-1069"></a>Profiles collected using an instrumented binary for multi-threaded programs may
be inconsistent due to missed counter updates. When this option is specified,
GCC uses heuristics to correct or smooth out such inconsistencies. By
default, GCC emits an error message when an inconsistent profile is detected.
<br><dt><code>-fprofile-dir=</code><var>path</var><dd><a name="index-fprofile_002ddir-1070"></a>
Set the directory in which to search for profile data files to <var>path</var>.
This option affects only the profile data generated by
<samp><span class="option">-fprofile-generate</span></samp>, <samp><span class="option">-ftest-coverage</span></samp>, <samp><span class="option">-fprofile-arcs</span></samp>
and used by <samp><span class="option">-fprofile-use</span></samp> and <samp><span class="option">-fbranch-probabilities</span></samp>
and its related options. Both absolute and relative paths can be used.
By default, GCC uses the current directory as <var>path</var>, thus the
profile data file appears in the same directory as the object file.
<br><dt><code>-fprofile-generate</code><dt><code>-fprofile-generate=</code><var>path</var><dd><a name="index-fprofile_002dgenerate-1071"></a>
Enable options usually used for instrumenting the application to produce
a profile useful for later recompilation with profile-feedback-based
optimization. You must use <samp><span class="option">-fprofile-generate</span></samp> both when
compiling and when linking your program.
<p>The following options are enabled: <samp><span class="option">-fprofile-arcs</span></samp>, <samp><span class="option">-fprofile-values</span></samp>, <samp><span class="option">-fvpt</span></samp>.
<p>If <var>path</var> is specified, GCC looks at the <var>path</var> to find
the profile feedback data files. See <samp><span class="option">-fprofile-dir</span></samp>.
<br><dt><code>-fprofile-use</code><dt><code>-fprofile-use=</code><var>path</var><dd><a name="index-fprofile_002duse-1072"></a>Enable profile feedback-directed optimizations,
and the following optimizations
which are generally profitable only with profile feedback available:
<samp><span class="option">-fbranch-probabilities</span></samp>, <samp><span class="option">-fvpt</span></samp>,
<samp><span class="option">-funroll-loops</span></samp>, <samp><span class="option">-fpeel-loops</span></samp>, <samp><span class="option">-ftracer</span></samp>,
<samp><span class="option">-ftree-vectorize</span></samp>, and <samp><span class="option">-ftree-loop-distribute-patterns</span></samp>.
<p>By default, GCC emits an error message if the feedback profiles do not
match the source code. This error can be turned into a warning by using
<samp><span class="option">-Wcoverage-mismatch</span></samp>. Note this may result in poorly optimized
code.
<p>If <var>path</var> is specified, GCC looks at the <var>path</var> to find
the profile feedback data files. See <samp><span class="option">-fprofile-dir</span></samp>.
<br><dt><code>-fauto-profile</code><dt><code>-fauto-profile=</code><var>path</var><dd><a name="index-fauto_002dprofile-1073"></a>Enable sampling-based feedback-directed optimizations,
and the following optimizations
which are generally profitable only with profile feedback available:
<samp><span class="option">-fbranch-probabilities</span></samp>, <samp><span class="option">-fvpt</span></samp>,
<samp><span class="option">-funroll-loops</span></samp>, <samp><span class="option">-fpeel-loops</span></samp>, <samp><span class="option">-ftracer</span></samp>,
<samp><span class="option">-ftree-vectorize</span></samp>,
<samp><span class="option">-finline-functions</span></samp>, <samp><span class="option">-fipa-cp</span></samp>, <samp><span class="option">-fipa-cp-clone</span></samp>,
<samp><span class="option">-fpredictive-commoning</span></samp>, <samp><span class="option">-funswitch-loops</span></samp>,
<samp><span class="option">-fgcse-after-reload</span></samp>, and <samp><span class="option">-ftree-loop-distribute-patterns</span></samp>.
<p><var>path</var> is the name of a file containing AutoFDO profile information.
If omitted, it defaults to <samp><span class="file">fbdata.afdo</span></samp> in the current directory.
<p>Producing an AutoFDO profile data file requires running your program
with the <samp><span class="command">perf</span></samp> utility on a supported GNU/Linux target system.
For more information, see <a href="path_to_url">path_to_url</a>.
<p>E.g.
<pre class="smallexample"> perf record -e br_inst_retired:near_taken -b -o perf.data \
-- your_program
</pre>
<p>Then use the <samp><span class="command">create_gcov</span></samp> tool to convert the raw profile data
to a format that can be used by GCC. You must also supply the
unstripped binary for your program to this tool.
See <a href="path_to_url">path_to_url</a>.
<p>E.g.
<pre class="smallexample"> create_gcov --binary=your_program.unstripped --profile=perf.data \
--gcov=profile.afdo
</pre>
</dl>
<p>The following options control compiler behavior regarding floating-point
arithmetic. These options trade off between speed and
correctness. All must be specifically enabled.
<dl>
<dt><code>-ffloat-store</code><dd><a name="index-ffloat_002dstore-1074"></a>Do not store floating-point variables in registers, and inhibit other
options that might change whether a floating-point value is taken from a
register or memory.
<p><a name="index-floating_002dpoint-precision-1075"></a>This option prevents undesirable excess precision on machines such as
the 68000 where the floating registers (of the 68881) keep more
precision than a <code>double</code> is supposed to have. Similarly for the
x86 architecture. For most programs, the excess precision does only
good, but a few programs rely on the precise definition of IEEE floating
point. Use <samp><span class="option">-ffloat-store</span></samp> for such programs, after modifying
them to store all pertinent intermediate computations into variables.
<br><dt><code>-fexcess-precision=</code><var>style</var><dd><a name="index-fexcess_002dprecision-1076"></a>This option allows further control over excess precision on machines
where floating-point registers have more precision than the IEEE
<code>float</code> and <code>double</code> types and the processor does not
support operations rounding to those types. By default,
<samp><span class="option">-fexcess-precision=fast</span></samp> is in effect; this means that
operations are carried out in the precision of the registers and that
it is unpredictable when rounding to the types specified in the source
code takes place. When compiling C, if
<samp><span class="option">-fexcess-precision=standard</span></samp> is specified then excess
precision follows the rules specified in ISO C99; in particular,
both casts and assignments cause values to be rounded to their
semantic types (whereas <samp><span class="option">-ffloat-store</span></samp> only affects
assignments). This option is enabled by default for C if a strict
conformance option such as <samp><span class="option">-std=c99</span></samp> is used.
<p><a name="index-mfpmath-1077"></a><samp><span class="option">-fexcess-precision=standard</span></samp> is not implemented for languages
other than C, and has no effect if
<samp><span class="option">-funsafe-math-optimizations</span></samp> or <samp><span class="option">-ffast-math</span></samp> is
specified. On the x86, it also has no effect if <samp><span class="option">-mfpmath=sse</span></samp>
or <samp><span class="option">-mfpmath=sse+387</span></samp> is specified; in the former case, IEEE
semantics apply without excess precision, and in the latter, rounding
is unpredictable.
<br><dt><code>-ffast-math</code><dd><a name="index-ffast_002dmath-1078"></a>Sets the options <samp><span class="option">-fno-math-errno</span></samp>, <samp><span class="option">-funsafe-math-optimizations</span></samp>,
<samp><span class="option">-ffinite-math-only</span></samp>, <samp><span class="option">-fno-rounding-math</span></samp>,
<samp><span class="option">-fno-signaling-nans</span></samp> and <samp><span class="option">-fcx-limited-range</span></samp>.
<p>This option causes the preprocessor macro <code>__FAST_MATH__</code> to be defined.
<p>This option is not turned on by any <samp><span class="option">-O</span></samp> option besides
<samp><span class="option">-Ofast</span></samp> since it can result in incorrect output for programs
that depend on an exact implementation of IEEE or ISO rules/specifications
for math functions. It may, however, yield faster code for programs
that do not require the guarantees of these specifications.
<br><dt><code>-fno-math-errno</code><dd><a name="index-fno_002dmath_002derrno-1079"></a>Do not set <code>errno</code> after calling math functions that are executed
with a single instruction, e.g., <code>sqrt</code>. A program that relies on
IEEE exceptions for math error handling may want to use this flag
for speed while maintaining IEEE arithmetic compatibility.
<p>This option is not turned on by any <samp><span class="option">-O</span></samp> option since
it can result in incorrect output for programs that depend on
an exact implementation of IEEE or ISO rules/specifications for
math functions. It may, however, yield faster code for programs
that do not require the guarantees of these specifications.
<p>The default is <samp><span class="option">-fmath-errno</span></samp>.
<p>On Darwin systems, the math library never sets <code>errno</code>. There is
therefore no reason for the compiler to consider the possibility that
it might, and <samp><span class="option">-fno-math-errno</span></samp> is the default.
<br><dt><code>-funsafe-math-optimizations</code><dd><a name="index-funsafe_002dmath_002doptimizations-1080"></a>
Allow optimizations for floating-point arithmetic that (a) assume
that arguments and results are valid and (b) may violate IEEE or
ANSI standards. When used at link-time, it may include libraries
or startup files that change the default FPU control word or other
similar optimizations.
<p>This option is not turned on by any <samp><span class="option">-O</span></samp> option since
it can result in incorrect output for programs that depend on
an exact implementation of IEEE or ISO rules/specifications for
math functions. It may, however, yield faster code for programs
that do not require the guarantees of these specifications.
Enables <samp><span class="option">-fno-signed-zeros</span></samp>, <samp><span class="option">-fno-trapping-math</span></samp>,
<samp><span class="option">-fassociative-math</span></samp> and <samp><span class="option">-freciprocal-math</span></samp>.
<p>The default is <samp><span class="option">-fno-unsafe-math-optimizations</span></samp>.
<br><dt><code>-fassociative-math</code><dd><a name="index-fassociative_002dmath-1081"></a>
Allow re-association of operands in series of floating-point operations.
This violates the ISO C and C++ language standards by possibly changing
the computation result.  NOTE: re-ordering may change the sign of zero as
well as ignore NaNs and inhibit or create underflow or overflow (and
thus cannot be used on code that relies on rounding behavior like
<code>(x + 2**52) - 2**52</code>).  May also reorder floating-point
comparisons and thus may not be used when ordered comparisons are required.
This option requires that both <samp><span class="option">-fno-signed-zeros</span></samp> and
<samp><span class="option">-fno-trapping-math</span></samp> be in effect. Moreover, it doesn't make
much sense with <samp><span class="option">-frounding-math</span></samp>. For Fortran the option
is automatically enabled when both <samp><span class="option">-fno-signed-zeros</span></samp> and
<samp><span class="option">-fno-trapping-math</span></samp> are in effect.
<p>The default is <samp><span class="option">-fno-associative-math</span></samp>.
<br><dt><code>-freciprocal-math</code><dd><a name="index-freciprocal_002dmath-1082"></a>
Allow the reciprocal of a value to be used instead of dividing by
the value if this enables optimizations. For example <code>x / y</code>
can be replaced with <code>x * (1/y)</code>, which is useful if <code>(1/y)</code>
is subject to common subexpression elimination. Note that this loses
precision and increases the number of flops operating on the value.
<p>The default is <samp><span class="option">-fno-reciprocal-math</span></samp>.
<br><dt><code>-ffinite-math-only</code><dd><a name="index-ffinite_002dmath_002donly-1083"></a>Allow optimizations for floating-point arithmetic that assume
that arguments and results are not NaNs or +-Infs.
<p>This option is not turned on by any <samp><span class="option">-O</span></samp> option since
it can result in incorrect output for programs that depend on
an exact implementation of IEEE or ISO rules/specifications for
math functions. It may, however, yield faster code for programs
that do not require the guarantees of these specifications.
<p>The default is <samp><span class="option">-fno-finite-math-only</span></samp>.
<br><dt><code>-fno-signed-zeros</code><dd><a name="index-fno_002dsigned_002dzeros-1084"></a>Allow optimizations for floating-point arithmetic that ignore the
signedness of zero. IEEE arithmetic specifies the behavior of
distinct +0.0 and −0.0 values, which then prohibits simplification
of expressions such as x+0.0 or 0.0*x (even with <samp><span class="option">-ffinite-math-only</span></samp>).
This option implies that the sign of a zero result isn't significant.
<p>The default is <samp><span class="option">-fsigned-zeros</span></samp>.
<br><dt><code>-fno-trapping-math</code><dd><a name="index-fno_002dtrapping_002dmath-1085"></a>Compile code assuming that floating-point operations cannot generate
user-visible traps. These traps include division by zero, overflow,
underflow, inexact result and invalid operation. This option requires
that <samp><span class="option">-fno-signaling-nans</span></samp> be in effect. Setting this option may
allow faster code if one relies on “non-stop” IEEE arithmetic, for example.
<p>This option should never be turned on by any <samp><span class="option">-O</span></samp> option since
it can result in incorrect output for programs that depend on
an exact implementation of IEEE or ISO rules/specifications for
math functions.
<p>The default is <samp><span class="option">-ftrapping-math</span></samp>.
<br><dt><code>-frounding-math</code><dd><a name="index-frounding_002dmath-1086"></a>Disable transformations and optimizations that assume default floating-point
rounding behavior. This is round-to-zero for all floating point
to integer conversions, and round-to-nearest for all other arithmetic
truncations. This option should be specified for programs that change
the FP rounding mode dynamically, or that may be executed with a
non-default rounding mode. This option disables constant folding of
floating-point expressions at compile time (which may be affected by
rounding mode) and arithmetic transformations that are unsafe in the
presence of sign-dependent rounding modes.
<p>The default is <samp><span class="option">-fno-rounding-math</span></samp>.
<p>This option is experimental and does not currently guarantee to
disable all GCC optimizations that are affected by rounding mode.
Future versions of GCC may provide finer control of this setting
using C99's <code>FENV_ACCESS</code> pragma. This command-line option
will be used to specify the default state for <code>FENV_ACCESS</code>.
<br><dt><code>-fsignaling-nans</code><dd><a name="index-fsignaling_002dnans-1087"></a>Compile code assuming that IEEE signaling NaNs may generate user-visible
traps during floating-point operations. Setting this option disables
optimizations that may change the number of exceptions visible with
signaling NaNs. This option implies <samp><span class="option">-ftrapping-math</span></samp>.
<p>This option causes the preprocessor macro <code>__SUPPORT_SNAN__</code> to
be defined.
<p>The default is <samp><span class="option">-fno-signaling-nans</span></samp>.
<p>This option is experimental and does not currently guarantee to
disable all GCC optimizations that affect signaling NaN behavior.
<br><dt><code>-fsingle-precision-constant</code><dd><a name="index-fsingle_002dprecision_002dconstant-1088"></a>Treat floating-point constants as single precision instead of
implicitly converting them to double-precision constants.
<br><dt><code>-fcx-limited-range</code><dd><a name="index-fcx_002dlimited_002drange-1089"></a>When enabled, this option states that a range reduction step is not
needed when performing complex division. Also, there is no checking
whether the result of a complex multiplication or division is <code>NaN
+ I*NaN</code>, with an attempt to rescue the situation in that case. The
default is <samp><span class="option">-fno-cx-limited-range</span></samp>, but is enabled by
<samp><span class="option">-ffast-math</span></samp>.
<p>This option controls the default setting of the ISO C99
<code>CX_LIMITED_RANGE</code> pragma. Nevertheless, the option applies to
all languages.
<br><dt><code>-fcx-fortran-rules</code><dd><a name="index-fcx_002dfortran_002drules-1090"></a>Complex multiplication and division follow Fortran rules. Range
reduction is done as part of complex division, but there is no checking
whether the result of a complex multiplication or division is <code>NaN
+ I*NaN</code>, with an attempt to rescue the situation in that case.
<p>The default is <samp><span class="option">-fno-cx-fortran-rules</span></samp>.
</dl>
<p>The following options control optimizations that may improve
performance, but are not enabled by any <samp><span class="option">-O</span></samp> options. This
section includes experimental options that may produce broken code.
<dl>
<dt><code>-fbranch-probabilities</code><dd><a name="index-fbranch_002dprobabilities-1091"></a>After running a program compiled with <samp><span class="option">-fprofile-arcs</span></samp>
(see <a href="Debugging-Options.html#Debugging-Options">Options for Debugging Your Program or <samp><span class="command">gcc</span></samp></a>), you can compile it a second time using
<samp><span class="option">-fbranch-probabilities</span></samp>, to improve optimizations based on
the number of times each branch was taken. When a program
compiled with <samp><span class="option">-fprofile-arcs</span></samp> exits, it saves arc execution
counts to a file called <samp><var>sourcename</var><span class="file">.gcda</span></samp> for each source
file. The information in this data file is very dependent on the
structure of the generated code, so you must use the same source code
and the same optimization options for both compilations.
<p>With <samp><span class="option">-fbranch-probabilities</span></samp>, GCC puts a
‘<samp><span class="samp">REG_BR_PROB</span></samp>’ note on each ‘<samp><span class="samp">JUMP_INSN</span></samp>’ and ‘<samp><span class="samp">CALL_INSN</span></samp>’.
These can be used to improve optimization. Currently, they are only
used in one place: in <samp><span class="file">reorg.c</span></samp>, instead of guessing which path a
branch is most likely to take, the ‘<samp><span class="samp">REG_BR_PROB</span></samp>’ values are used to
exactly determine which path is taken more often.
<br><dt><code>-fprofile-values</code><dd><a name="index-fprofile_002dvalues-1092"></a>If combined with <samp><span class="option">-fprofile-arcs</span></samp>, it adds code so that some
data about values of expressions in the program is gathered.
<p>With <samp><span class="option">-fbranch-probabilities</span></samp>, it reads back the data gathered
from profiling values of expressions for usage in optimizations.
<p>Enabled with <samp><span class="option">-fprofile-generate</span></samp> and <samp><span class="option">-fprofile-use</span></samp>.
<br><dt><code>-fprofile-reorder-functions</code><dd><a name="index-fprofile_002dreorder_002dfunctions-1093"></a>Function reordering based on profile instrumentation collects the
time of first execution of each function and orders the functions
in ascending order of that time.
<p>Enabled with <samp><span class="option">-fprofile-use</span></samp>.
<br><dt><code>-fvpt</code><dd><a name="index-fvpt-1094"></a>If combined with <samp><span class="option">-fprofile-arcs</span></samp>, this option instructs the compiler
to add code to gather information about values of expressions.
<p>With <samp><span class="option">-fbranch-probabilities</span></samp>, it reads back the data gathered
and actually performs the optimizations based on them.
Currently the optimizations include specialization of division operations
using the knowledge about the value of the denominator.
<br><dt><code>-frename-registers</code><dd><a name="index-frename_002dregisters-1095"></a>Attempt to avoid false dependencies in scheduled code by making use
of registers left over after register allocation. This optimization
most benefits processors with lots of registers. Depending on the
debug information format adopted by the target, however, it can
make debugging impossible, since variables no longer stay in
a “home register”.
<p>Enabled by default with <samp><span class="option">-funroll-loops</span></samp> and <samp><span class="option">-fpeel-loops</span></samp>.
<br><dt><code>-fschedule-fusion</code><dd><a name="index-fschedule_002dfusion-1096"></a>Performs a target-dependent pass over the instruction stream to schedule
instructions of the same type together, because the target machine can
execute them more efficiently if they are adjacent to each other in the
instruction flow.
<p>Enabled at levels <samp><span class="option">-O2</span></samp>, <samp><span class="option">-O3</span></samp>, <samp><span class="option">-Os</span></samp>.
<br><dt><code>-ftracer</code><dd><a name="index-ftracer-1097"></a>Perform tail duplication to enlarge superblock size. This transformation
simplifies the control flow of the function allowing other optimizations to do
a better job.
<p>Enabled with <samp><span class="option">-fprofile-use</span></samp>.
<br><dt><code>-funroll-loops</code><dd><a name="index-funroll_002dloops-1098"></a>Unroll loops whose number of iterations can be determined at compile time or
upon entry to the loop. <samp><span class="option">-funroll-loops</span></samp> implies
<samp><span class="option">-frerun-cse-after-loop</span></samp>, <samp><span class="option">-fweb</span></samp> and <samp><span class="option">-frename-registers</span></samp>.
It also turns on complete loop peeling (i.e. complete removal of loops with
a small constant number of iterations). This option makes code larger, and may
or may not make it run faster.
<p>Enabled with <samp><span class="option">-fprofile-use</span></samp>.
<br><dt><code>-funroll-all-loops</code><dd><a name="index-funroll_002dall_002dloops-1099"></a>Unroll all loops, even if their number of iterations is uncertain when
the loop is entered. This usually makes programs run more slowly.
<samp><span class="option">-funroll-all-loops</span></samp> implies the same options as
<samp><span class="option">-funroll-loops</span></samp>.
<br><dt><code>-fpeel-loops</code><dd><a name="index-fpeel_002dloops-1100"></a>Peels loops for which there is enough information that they do not
roll much (from profile feedback). It also turns on complete loop peeling
(i.e. complete removal of loops with small constant number of iterations).
<p>Enabled with <samp><span class="option">-fprofile-use</span></samp>.
<br><dt><code>-fmove-loop-invariants</code><dd><a name="index-fmove_002dloop_002dinvariants-1101"></a>Enables the loop invariant motion pass in the RTL loop optimizer. Enabled
at level <samp><span class="option">-O1</span></samp>.
<br><dt><code>-funswitch-loops</code><dd><a name="index-funswitch_002dloops-1102"></a>Move branches with loop invariant conditions out of the loop, with duplicates
of the loop on both branches (modified according to result of the condition).
<br><dt><code>-ffunction-sections</code><dt><code>-fdata-sections</code><dd><a name="index-ffunction_002dsections-1103"></a><a name="index-fdata_002dsections-1104"></a>Place each function or data item into its own section in the output
file if the target supports arbitrary sections. The name of the
function or the name of the data item determines the section's name
in the output file.
<p>Use these options on systems where the linker can perform optimizations
to improve locality of reference in the instruction space. Most systems
using the ELF object format and SPARC processors running Solaris 2 have
linkers with such optimizations. AIX may have these optimizations in
the future.
<p>Only use these options when there are significant benefits from doing
so. When you specify these options, the assembler and linker
create larger object and executable files and are also slower.
You cannot use <samp><span class="command">gprof</span></samp> on all systems if you
specify this option, and you may have problems with debugging if
you specify both this option and <samp><span class="option">-g</span></samp>.
<br><dt><code>-fbranch-target-load-optimize</code><dd><a name="index-fbranch_002dtarget_002dload_002doptimize-1105"></a>Perform branch target register load optimization before prologue / epilogue
threading.
The use of target registers can typically be exposed only during reload,
thus hoisting loads out of loops and doing inter-block scheduling needs
a separate optimization pass.
<br><dt><code>-fbranch-target-load-optimize2</code><dd><a name="index-fbranch_002dtarget_002dload_002doptimize2-1106"></a>Perform branch target register load optimization after prologue / epilogue
threading.
<br><dt><code>-fbtr-bb-exclusive</code><dd><a name="index-fbtr_002dbb_002dexclusive-1107"></a>When performing branch target register load optimization, don't reuse
branch target registers within any basic block.
<br><dt><code>-fstack-protector</code><dd><a name="index-fstack_002dprotector-1108"></a>Emit extra code to check for buffer overflows, such as stack smashing
attacks. This is done by adding a guard variable to functions with
vulnerable objects. This includes functions that call <code>alloca</code>, and
functions with buffers larger than 8 bytes. The guards are initialized
when a function is entered and then checked when the function exits.
If a guard check fails, an error message is printed and the program exits.
<br><dt><code>-fstack-protector-all</code><dd><a name="index-fstack_002dprotector_002dall-1109"></a>Like <samp><span class="option">-fstack-protector</span></samp> except that all functions are protected.
<br><dt><code>-fstack-protector-strong</code><dd><a name="index-fstack_002dprotector_002dstrong-1110"></a>Like <samp><span class="option">-fstack-protector</span></samp> but includes additional functions to
be protected — those that have local array definitions, or have
references to local frame addresses.
<br><dt><code>-fstack-protector-explicit</code><dd><a name="index-fstack_002dprotector_002dexplicit-1111"></a>Like <samp><span class="option">-fstack-protector</span></samp> but only protects those functions which
have the <code>stack_protect</code> attribute.
<br><dt><code>-fstdarg-opt</code><dd><a name="index-fstdarg_002dopt-1112"></a>Optimize the prologue of variadic argument functions with respect to usage of
those arguments.
<br><dt><code>-fsection-anchors</code><dd><a name="index-fsection_002danchors-1113"></a>Try to reduce the number of symbolic address calculations by using
shared “anchor” symbols to address nearby objects. This transformation
can help to reduce the number of GOT entries and GOT accesses on some
targets.
<p>For example, the implementation of the following function <code>foo</code>:
<pre class="smallexample"> static int a, b, c;
int foo (void) { return a + b + c; }
</pre>
<p class="noindent">usually calculates the addresses of all three variables, but if you
compile it with <samp><span class="option">-fsection-anchors</span></samp>, it accesses the variables
from a common anchor point instead. The effect is similar to the
following pseudocode (which isn't valid C):
<pre class="smallexample"> int foo (void)
{
register int *xr = &x;
return xr[&a - &x] + xr[&b - &x] + xr[&c - &x];
}
</pre>
<p>Not all targets support this option.
<br><dt><code>--param </code><var>name</var><code>=</code><var>value</var><dd><a name="index-param-1114"></a>In some places, GCC uses various constants to control the amount of
optimization that is done. For example, GCC does not inline functions
that contain more than a certain number of instructions. You can
control some of these constants on the command line using the
<samp><span class="option">--param</span></samp> option.
<p>The names of specific parameters, and the meaning of the values, are
tied to the internals of the compiler, and are subject to change
without notice in future releases.
<p>In each case, the <var>value</var> is an integer. The allowable choices for
<var>name</var> are:
<dl>
<dt><code>predictable-branch-outcome</code><dd>When a branch is predicted to be taken with probability lower than this threshold
(in percent), it is considered well predictable.  The default is 10.
<br><dt><code>max-crossjump-edges</code><dd>The maximum number of incoming edges to consider for cross-jumping.
The algorithm used by <samp><span class="option">-fcrossjumping</span></samp> is O(N^2) in
the number of edges incoming to each block. Increasing values mean
more aggressive optimization, making the compilation time increase with
probably small improvement in executable size.
<br><dt><code>min-crossjump-insns</code><dd>The minimum number of instructions that must be matched at the end
of two blocks before cross-jumping is performed on them. This
value is ignored in the case where all instructions in the block being
cross-jumped from are matched. The default value is 5.
<br><dt><code>max-grow-copy-bb-insns</code><dd>The maximum code size expansion factor when copying basic blocks
instead of jumping. The expansion is relative to a jump instruction.
The default value is 8.
<br><dt><code>max-goto-duplication-insns</code><dd>The maximum number of instructions to duplicate to a block that jumps
to a computed goto. To avoid O(N^2) behavior in a number of
passes, GCC factors computed gotos early in the compilation process,
and unfactors them as late as possible. Only computed jumps at the
end of a basic block with no more than max-goto-duplication-insns are
unfactored. The default value is 8.
<br><dt><code>max-delay-slot-insn-search</code><dd>The maximum number of instructions to consider when looking for an
instruction to fill a delay slot. If more than this arbitrary number of
instructions are searched, the time savings from filling the delay slot
are minimal, so stop searching. Increasing values mean more
aggressive optimization, making the compilation time increase with probably
small improvement in execution time.
<br><dt><code>max-delay-slot-live-search</code><dd>When trying to fill delay slots, the maximum number of instructions to
consider when searching for a block with valid live register
information. Increasing this arbitrarily chosen value means more
aggressive optimization, increasing the compilation time. This parameter
should be removed when the delay slot code is rewritten to maintain the
control-flow graph.
<br><dt><code>max-gcse-memory</code><dd>The approximate maximum amount of memory that can be allocated in
order to perform the global common subexpression elimination
optimization. If more memory than specified is required, the
optimization is not done.
<br><dt><code>max-gcse-insertion-ratio</code><dd>If the ratio of expression insertions to deletions is larger than this value
for any expression, then RTL PRE inserts or removes the expression and thus
leaves partially redundant computations in the instruction stream. The default value is 20.
<br><dt><code>max-pending-list-length</code><dd>The maximum number of pending dependencies scheduling allows
before flushing the current state and starting over. Large functions
with few branches or calls can create excessively large lists which
needlessly consume memory and resources.
<br><dt><code>max-modulo-backtrack-attempts</code><dd>The maximum number of backtrack attempts the scheduler should make
when modulo scheduling a loop. Larger values can exponentially increase
compilation time.
<br><dt><code>max-inline-insns-single</code><dd>Several parameters control the tree inliner used in GCC.
This number sets the maximum number of instructions (counted in GCC's
internal representation) in a single function that the tree inliner
considers for inlining. This only affects functions declared
inline and methods implemented in a class declaration (C++).
The default value is 400.
<br><dt><code>max-inline-insns-auto</code><dd>When you use <samp><span class="option">-finline-functions</span></samp> (included in <samp><span class="option">-O3</span></samp>),
a lot of functions that would otherwise not be considered for inlining
by the compiler are investigated. To those functions, a different
(more restrictive) limit compared to functions declared inline can
be applied.
The default value is 40.
<br><dt><code>inline-min-speedup</code><dd>When the estimated performance improvement of caller + callee runtime exceeds this
threshold (in percent), the function can be inlined regardless of the limits on
<samp><span class="option">--param max-inline-insns-single</span></samp> and <samp><span class="option">--param
max-inline-insns-auto</span></samp>.
<br><dt><code>large-function-insns</code><dd>The limit specifying really large functions. For functions larger than this
limit after inlining, inlining is constrained by
<samp><span class="option">--param large-function-growth</span></samp>. This parameter is useful primarily
to avoid extreme compilation time caused by non-linear algorithms used by the
back end.
The default value is 2700.
<br><dt><code>large-function-growth</code><dd>Specifies maximal growth of large function caused by inlining in percents.
The default value is 100 which limits large function growth to 2.0 times
the original size.
<br><dt><code>large-unit-insns</code><dd>The limit specifying a large translation unit.  Growth caused by inlining of
units larger than this limit is limited by <samp><span class="option">--param inline-unit-growth</span></samp>.
For small units this might be too tight.
For example, consider a unit consisting of function A
that is inline and B that just calls A three times. If B is small relative to
A, the growth of the unit is 300% and yet such inlining is very sane.  For very
large units consisting of small inlineable functions, however, the overall unit
growth limit is needed to avoid exponential explosion of code size. Thus for
smaller units, the size is increased to <samp><span class="option">--param large-unit-insns</span></samp>
before applying <samp><span class="option">--param inline-unit-growth</span></samp>. The default is 10000.
<br><dt><code>inline-unit-growth</code><dd>Specifies maximal overall growth of the compilation unit caused by inlining.
The default value is 20 which limits unit growth to 1.2 times the original
size. Cold functions (either marked cold via an attribute or by profile
feedback) are not accounted into the unit size.
<br><dt><code>ipcp-unit-growth</code><dd>Specifies maximal overall growth of the compilation unit caused by
interprocedural constant propagation. The default value is 10 which limits
unit growth to 1.1 times the original size.
<br><dt><code>large-stack-frame</code><dd>The limit specifying large stack frames.  While inlining, the algorithm tries
not to grow past this limit too much.  The default value is 256 bytes.
<br><dt><code>large-stack-frame-growth</code><dd>Specifies maximal growth of large stack frames caused by inlining in percents.
The default value is 1000 which limits large stack frame growth to 11 times
the original size.
<br><dt><code>max-inline-insns-recursive</code><dt><code>max-inline-insns-recursive-auto</code><dd>Specifies the maximum number of instructions an out-of-line copy of a
self-recursive inline
function can grow into by performing recursive inlining.
<p><samp><span class="option">--param max-inline-insns-recursive</span></samp> applies to functions
declared inline.
For functions not declared inline, recursive inlining
happens only when <samp><span class="option">-finline-functions</span></samp> (included in <samp><span class="option">-O3</span></samp>) is
enabled; <samp><span class="option">--param max-inline-insns-recursive-auto</span></samp> applies instead. The
default value is 450.
<br><dt><code>max-inline-recursive-depth</code><dt><code>max-inline-recursive-depth-auto</code><dd>Specifies the maximum recursion depth used for recursive inlining.
<p><samp><span class="option">--param max-inline-recursive-depth</span></samp> applies to functions
declared inline. For functions not declared inline, recursive inlining
happens only when <samp><span class="option">-finline-functions</span></samp> (included in <samp><span class="option">-O3</span></samp>) is
enabled; <samp><span class="option">--param max-inline-recursive-depth-auto</span></samp> applies instead. The
default value is 8.
<br><dt><code>min-inline-recursive-probability</code><dd>Recursive inlining is profitable only for functions having deep recursion
on average, and can hurt functions having little recursion depth by
increasing the prologue size or the complexity of the function body for
other optimizers.
<p>When profile feedback is available (see <samp><span class="option">-fprofile-generate</span></samp>) the actual
recursion depth can be guessed from probability that function recurses via a
given call expression. This parameter limits inlining only to call expressions
whose probability exceeds the given threshold (in percents).
The default value is 10.
<br><dt><code>early-inlining-insns</code><dd>Specify growth that the early inliner can make. In effect it increases
the amount of inlining for code having a large abstraction penalty.
The default value is 14.
<br><dt><code>max-early-inliner-iterations</code><dd>Limit of iterations of the early inliner. This basically bounds
the number of nested indirect calls the early inliner can resolve.
Deeper chains are still handled by late inlining.
<br><dt><code>comdat-sharing-probability</code><dd>Probability (in percent) that C++ inline functions with comdat visibility
are shared across multiple compilation units.  The default value is 20.
<br><dt><code>profile-func-internal-id</code><dd>A parameter to control whether to use function internal id in profile
database lookup. If the value is 0, the compiler uses an id that
is based on function assembler name and filename, which makes old profile
data more tolerant to source changes such as function reordering etc.
The default value is 0.
<br><dt><code>min-vect-loop-bound</code><dd>The minimum number of iterations under which loops are not vectorized
when <samp><span class="option">-ftree-vectorize</span></samp> is used. The number of iterations after
vectorization needs to be greater than the value specified by this option
to allow vectorization. The default value is 0.
<br><dt><code>gcse-cost-distance-ratio</code><dd>Scaling factor in calculation of maximum distance an expression
can be moved by GCSE optimizations. This is currently supported only in the
code hoisting pass. The bigger the ratio, the more aggressive code hoisting
is with simple expressions, i.e., the expressions that have cost
less than <samp><span class="option">gcse-unrestricted-cost</span></samp>. Specifying 0 disables
hoisting of simple expressions. The default value is 10.
<br><dt><code>gcse-unrestricted-cost</code><dd>Cost, roughly measured as the cost of a single typical machine
instruction, at which GCSE optimizations do not constrain
the distance an expression can travel. This is currently
supported only in the code hoisting pass. The lesser the cost,
the more aggressive code hoisting is. Specifying 0
allows all expressions to travel unrestricted distances.
The default value is 3.
<br><dt><code>max-hoist-depth</code><dd>The depth of search in the dominator tree for expressions to hoist.
This is used to avoid quadratic behavior in the hoisting algorithm.
A value of 0 does not limit the search, but may slow down compilation
of huge functions.  The default value is 30.
<br><dt><code>max-tail-merge-comparisons</code><dd>The maximum number of similar basic blocks to compare a basic block with.
This is used to avoid quadratic behavior in tree tail merging.  The default
value is 10.
<br><dt><code>max-tail-merge-iterations</code><dd>The maximum number of iterations of the pass over the function.  This is
used to limit compilation time in tree tail merging.  The default value is 2.
<br><dt><code>max-unrolled-insns</code><dd>The maximum number of instructions that a loop may have to be unrolled.
If a loop is unrolled, this parameter also determines how many times
the loop code is unrolled.
<br><dt><code>max-average-unrolled-insns</code><dd>The maximum number of instructions biased by probabilities of their execution
that a loop may have to be unrolled. If a loop is unrolled,
this parameter also determines how many times the loop code is unrolled.
<br><dt><code>max-unroll-times</code><dd>The maximum number of unrollings of a single loop.
<br><dt><code>max-peeled-insns</code><dd>The maximum number of instructions that a loop may have to be peeled.
If a loop is peeled, this parameter also determines how many times
the loop code is peeled.
<br><dt><code>max-peel-times</code><dd>The maximum number of peelings of a single loop.
<br><dt><code>max-peel-branches</code><dd>The maximum number of branches on the hot path through the peeled sequence.
<br><dt><code>max-completely-peeled-insns</code><dd>The maximum number of insns of a completely peeled loop.
<br><dt><code>max-completely-peel-times</code><dd>The maximum number of iterations of a loop to be suitable for complete peeling.
<br><dt><code>max-completely-peel-loop-nest-depth</code><dd>The maximum depth of a loop nest suitable for complete peeling.
<br><dt><code>max-unswitch-insns</code><dd>The maximum number of insns of an unswitched loop.
<br><dt><code>max-unswitch-level</code><dd>The maximum number of branches unswitched in a single loop.
<br><dt><code>lim-expensive</code><dd>The minimum cost of an expensive expression in the loop invariant motion.
<br><dt><code>iv-consider-all-candidates-bound</code><dd>Bound on number of candidates for induction variables, below which
all candidates are considered for each use in induction variable
optimizations. If there are more candidates than this,
only the most relevant ones are considered to avoid quadratic time complexity.
<br><dt><code>iv-max-considered-uses</code><dd>The induction variable optimizations give up on loops that contain more
induction variable uses than the number given by this parameter.
<br><dt><code>iv-always-prune-cand-set-bound</code><dd>If the number of candidates in the set is smaller than this value,
always try to remove unnecessary ivs from the set
when adding a new one.
<br><dt><code>scev-max-expr-size</code><dd>Bound on size of expressions used in the scalar evolutions analyzer.
Large expressions slow the analyzer.
<br><dt><code>scev-max-expr-complexity</code><dd>Bound on the complexity of the expressions in the scalar evolutions analyzer.
Complex expressions slow the analyzer.
<br><dt><code>omega-max-vars</code><dd>The maximum number of variables in an Omega constraint system.
The default value is 128.
<br><dt><code>omega-max-geqs</code><dd>The maximum number of inequalities in an Omega constraint system.
The default value is 256.
<br><dt><code>omega-max-eqs</code><dd>The maximum number of equalities in an Omega constraint system.
The default value is 128.
<br><dt><code>omega-max-wild-cards</code><dd>The maximum number of wildcard variables that the Omega solver is
able to insert. The default value is 18.
<br><dt><code>omega-hash-table-size</code><dd>The size of the hash table in the Omega solver. The default value is
550.
<br><dt><code>omega-max-keys</code><dd>The maximal number of keys used by the Omega solver. The default
value is 500.
<br><dt><code>omega-eliminate-redundant-constraints</code><dd>When set to 1, use expensive methods to eliminate all redundant
constraints. The default value is 0.
<br><dt><code>vect-max-version-for-alignment-checks</code><dd>The maximum number of run-time checks that can be performed when
doing loop versioning for alignment in the vectorizer.
<br><dt><code>vect-max-version-for-alias-checks</code><dd>The maximum number of run-time checks that can be performed when
doing loop versioning for alias in the vectorizer.
<br><dt><code>vect-max-peeling-for-alignment</code><dd>The maximum number of loop peels to enhance access alignment
for the vectorizer. A value of -1 means 'no limit'.
<br><dt><code>max-iterations-to-track</code><dd>The maximum number of iterations of a loop the brute-force algorithm
for analysis of the number of iterations of the loop tries to evaluate.
<br><dt><code>hot-bb-count-ws-permille</code><dd>A basic block profile count is considered hot if it contributes to
the given permillage (i.e. 0...1000) of the entire profiled execution.
<br><dt><code>hot-bb-frequency-fraction</code><dd>Select the fraction of the function entry block's execution frequency that a
basic block must reach in order to be considered hot.
<br><dt><code>max-predicted-iterations</code><dd>The maximum number of loop iterations we predict statically. This is useful
in cases where a function contains a single loop with known bound and
another loop with unknown bound.
The known number of iterations is predicted correctly, while
the unknown number of iterations averages to roughly 10. This means that the
loop without a known bound appears artificially cold relative to the other one.
<br><dt><code>builtin-expect-probability</code><dd>Control the probability of the expression having the specified value. This
parameter takes a percentage (i.e. 0 ... 100) as input.
The default probability of 90 is obtained empirically.
<br><dt><code>align-threshold</code><dd>
Select fraction of the maximal frequency of executions of a basic block in
a function to align the basic block.
<br><dt><code>align-loop-iterations</code><dd>
A loop expected to iterate at least the selected number of iterations is
aligned.
<br><dt><code>tracer-dynamic-coverage</code><dt><code>tracer-dynamic-coverage-feedback</code><dd>
This value is used to limit superblock formation once the given percentage of
executed instructions is covered. This limits unnecessary code size
expansion.
<p>The <samp><span class="option">tracer-dynamic-coverage-feedback</span></samp> parameter
is used only when profile
feedback is available. The real profiles (as opposed to statically estimated
ones) are much less balanced, allowing the threshold to be a larger value.
<br><dt><code>tracer-max-code-growth</code><dd>Stop tail duplication once code growth has reached given percentage. This is
a rather artificial limit, as most of the duplicates are eliminated later in
cross jumping, so it may be set to much higher values than is the desired code
growth.
<br><dt><code>tracer-min-branch-ratio</code><dd>
Stop reverse growth when the reverse probability of best edge is less than this
threshold (in percent).
<br><dt><code>tracer-min-branch-probability</code><dt><code>tracer-min-branch-probability-feedback</code><dd>
Stop forward growth if the best edge has probability lower than this
threshold.
<p>Similarly to <samp><span class="option">tracer-dynamic-coverage</span></samp> two values are present, one for
compilation for profile feedback and one for compilation without. The value
for compilation with profile feedback needs to be more conservative (higher) in
order to make tracer effective.
<br><dt><code>max-cse-path-length</code><dd>
The maximum number of basic blocks on a path that CSE considers.
The default is 10.
<br><dt><code>max-cse-insns</code><dd>The maximum number of instructions CSE processes before flushing.
The default is 1000.
<br><dt><code>ggc-min-expand</code><dd>
GCC uses a garbage collector to manage its own memory allocation. This
parameter specifies the minimum percentage by which the garbage
collector's heap should be allowed to expand between collections.
Tuning this may improve compilation speed; it has no effect on code
generation.
<p>The default is 30% + 70% * (RAM/1GB) with an upper bound of 100% when
RAM >= 1GB. If <code>getrlimit</code> is available, the notion of “RAM” is
the smallest of actual RAM and <code>RLIMIT_DATA</code> or <code>RLIMIT_AS</code>. If
GCC is not able to calculate RAM on a particular platform, the lower
bound of 30% is used. Setting this parameter and
<samp><span class="option">ggc-min-heapsize</span></samp> to zero causes a full collection to occur at
every opportunity. This is extremely slow, but can be useful for
debugging.
<br><dt><code>ggc-min-heapsize</code><dd>
Minimum size of the garbage collector's heap before it begins bothering
to collect garbage. The first collection occurs after the heap expands
by <samp><span class="option">ggc-min-expand</span></samp>% beyond <samp><span class="option">ggc-min-heapsize</span></samp>. Again,
tuning this may improve compilation speed, and has no effect on code
generation.
<p>The default is the smaller of RAM/8, RLIMIT_RSS, or a limit that
tries to ensure that RLIMIT_DATA or RLIMIT_AS are not exceeded, but
with a lower bound of 4096 (four megabytes) and an upper bound of
131072 (128 megabytes). If GCC is not able to calculate RAM on a
particular platform, the lower bound is used. Setting this parameter
very large effectively disables garbage collection. Setting this
parameter and <samp><span class="option">ggc-min-expand</span></samp> to zero causes a full collection
to occur at every opportunity.
<br><dt><code>max-reload-search-insns</code><dd>The maximum number of instructions that reload should look backward for an
equivalent register. Increasing values mean more aggressive optimization, making the
compilation time increase with probably slightly better performance.
The default value is 100.
<br><dt><code>max-cselib-memory-locations</code><dd>The maximum number of memory locations cselib should take into account.
Increasing values mean more aggressive optimization, making the compilation time
increase with probably slightly better performance. The default value is 500.
<br><dt><code>reorder-blocks-duplicate</code><dt><code>reorder-blocks-duplicate-feedback</code><dd>
Used by the basic block reordering pass to decide whether to use unconditional
branch or duplicate the code on its destination. Code is duplicated when its
estimated size is smaller than this value multiplied by the estimated size of
unconditional jump in the hot spots of the program.
<p>The <samp><span class="option">reorder-block-duplicate-feedback</span></samp> parameter
is used only when profile
feedback is available. It may be set to higher values than
<samp><span class="option">reorder-block-duplicate</span></samp> since information about the hot spots is more
accurate.
<br><dt><code>max-sched-ready-insns</code><dd>The maximum number of instructions ready to be issued the scheduler should
consider at any given time during the first scheduling pass. Increasing
values mean more thorough searches, making the compilation time increase
with probably little benefit. The default value is 100.
<br><dt><code>max-sched-region-blocks</code><dd>The maximum number of blocks in a region to be considered for
interblock scheduling. The default value is 10.
<br><dt><code>max-pipeline-region-blocks</code><dd>The maximum number of blocks in a region to be considered for
pipelining in the selective scheduler. The default value is 15.
<br><dt><code>max-sched-region-insns</code><dd>The maximum number of insns in a region to be considered for
interblock scheduling. The default value is 100.
<br><dt><code>max-pipeline-region-insns</code><dd>The maximum number of insns in a region to be considered for
pipelining in the selective scheduler. The default value is 200.
<br><dt><code>min-spec-prob</code><dd>The minimum probability (in percent) of reaching a source block
for interblock speculative scheduling. The default value is 40.
<br><dt><code>max-sched-extend-regions-iters</code><dd>The maximum number of iterations through CFG to extend regions.
A value of 0 (the default) disables region extensions.
<br><dt><code>max-sched-insn-conflict-delay</code><dd>The maximum conflict delay for an insn to be considered for speculative motion.
The default value is 3.
<br><dt><code>sched-spec-prob-cutoff</code><dd>The minimal probability of speculation success (in percent) required for
speculative insns to be scheduled.
The default value is 40.
<br><dt><code>sched-spec-state-edge-prob-cutoff</code><dd>The minimum probability an edge must have for the scheduler to save its
state across it.
The default value is 10.
<br><dt><code>sched-mem-true-dep-cost</code><dd>Minimal distance (in CPU cycles) between a store and a load targeting the
same memory location. The default value is 1.
<br><dt><code>selsched-max-lookahead</code><dd>The maximum size of the lookahead window of selective scheduling. It is a
depth of search for available instructions.
The default value is 50.
<br><dt><code>selsched-max-sched-times</code><dd>The maximum number of times that an instruction is scheduled during
selective scheduling. This is the limit on the number of iterations
through which the instruction may be pipelined. The default value is 2.
<br><dt><code>selsched-max-insns-to-rename</code><dd>The maximum number of best instructions in the ready list that are considered
for renaming in the selective scheduler. The default value is 2.
<br><dt><code>sms-min-sc</code><dd>The minimum value of stage count that swing modulo scheduler
generates. The default value is 2.
<br><dt><code>max-last-value-rtl</code><dd>The maximum size measured as number of RTLs that can be recorded in an expression
in combiner for a pseudo register as last known value of that register. The default
is 10000.
<br><dt><code>max-combine-insns</code><dd>The maximum number of instructions the RTL combiner tries to combine.
The default value is 2 at <samp><span class="option">-Og</span></samp> and 4 otherwise.
<br><dt><code>integer-share-limit</code><dd>Small integer constants can use a shared data structure, reducing the
compiler's memory usage and increasing its speed. This sets the maximum
value of a shared integer constant. The default value is 256.
<br><dt><code>ssp-buffer-size</code><dd>The minimum size of buffers (i.e. arrays) that receive stack smashing
protection when <samp><span class="option">-fstack-protector</span></samp> is used.
<br><dt><code>min-size-for-stack-sharing</code><dd>The minimum size of variables taking part in stack slot sharing when not
optimizing. The default value is 32.
<br><dt><code>max-jump-thread-duplication-stmts</code><dd>Maximum number of statements allowed in a block that needs to be
duplicated when threading jumps.
<br><dt><code>max-fields-for-field-sensitive</code><dd>Maximum number of fields in a structure treated in
a field sensitive manner during pointer analysis. The default is zero
for <samp><span class="option">-O0</span></samp> and <samp><span class="option">-O1</span></samp>,
and 100 for <samp><span class="option">-Os</span></samp>, <samp><span class="option">-O2</span></samp>, and <samp><span class="option">-O3</span></samp>.
<br><dt><code>prefetch-latency</code><dd>Estimate on average number of instructions that are executed before
prefetch finishes. The distance prefetched ahead is proportional
to this constant. Increasing this number may also lead to less
streams being prefetched (see <samp><span class="option">simultaneous-prefetches</span></samp>).
<br><dt><code>simultaneous-prefetches</code><dd>Maximum number of prefetches that can run at the same time.
<br><dt><code>l1-cache-line-size</code><dd>The size of cache line in L1 cache, in bytes.
<br><dt><code>l1-cache-size</code><dd>The size of L1 cache, in kilobytes.
<br><dt><code>l2-cache-size</code><dd>The size of L2 cache, in kilobytes.
<br><dt><code>min-insn-to-prefetch-ratio</code><dd>The minimum ratio between the number of instructions and the
number of prefetches to enable prefetching in a loop.
<br><dt><code>prefetch-min-insn-to-mem-ratio</code><dd>The minimum ratio between the number of instructions and the
number of memory references to enable prefetching in a loop.
<br><dt><code>use-canonical-types</code><dd>Whether the compiler should use the “canonical” type system. By
default, this should always be 1, which uses a more efficient internal
mechanism for comparing types in C++ and Objective-C++. However, if
bugs in the canonical type system are causing compilation failures,
set this value to 0 to disable canonical types.
<br><dt><code>switch-conversion-max-branch-ratio</code><dd>Switch initialization conversion refuses to create arrays that are
bigger than <samp><span class="option">switch-conversion-max-branch-ratio</span></samp> times the number of
branches in the switch.
<br><dt><code>max-partial-antic-length</code><dd>Maximum length of the partial antic set computed during the tree
partial redundancy elimination optimization (<samp><span class="option">-ftree-pre</span></samp>) when
optimizing at <samp><span class="option">-O3</span></samp> and above. For some sorts of source code
the enhanced partial redundancy elimination optimization can run away,
consuming all of the memory available on the host machine. This
parameter sets a limit on the length of the sets that are computed,
which prevents the runaway behavior. Setting a value of 0 for
this parameter allows an unlimited set length.
<br><dt><code>sccvn-max-scc-size</code><dd>Maximum size of a strongly connected component (SCC) during SCCVN
processing. If this limit is hit, SCCVN processing for the whole
function is not done and optimizations depending on it are
disabled. The default maximum SCC size is 10000.
<br><dt><code>sccvn-max-alias-queries-per-access</code><dd>Maximum number of alias-oracle queries we perform when looking for
redundancies for loads and stores. If this limit is hit the search
is aborted and the load or store is not considered redundant. The
number of queries is algorithmically limited to the number of
stores on all paths from the load to the function entry.
The default maximum number of queries is 1000.
<br><dt><code>ira-max-loops-num</code><dd>IRA uses regional register allocation by default. If a function
contains more loops than the number given by this parameter, only at most
the given number of the most frequently-executed loops form regions
for regional register allocation. The default value of the
parameter is 100.
<br><dt><code>ira-max-conflict-table-size</code><dd>Although IRA uses a sophisticated algorithm to compress the conflict
table, the table can still require excessive amounts of memory for
huge functions. If the conflict table for a function could be more
than the size in MB given by this parameter, the register allocator
instead uses a faster, simpler, and lower-quality
algorithm that does not require building a pseudo-register conflict table.
The default value of the parameter is 2000.
<br><dt><code>ira-loop-reserved-regs</code><dd>IRA can be used to evaluate more accurate register pressure in loops
for decisions to move loop invariants (see <samp><span class="option">-O3</span></samp>). The number
of available registers reserved for some other purposes is given
by this parameter. The default value of the parameter is 2, which is
the minimal number of registers needed by typical instructions.
This value is the best found from numerous experiments.
<br><dt><code>lra-inheritance-ebb-probability-cutoff</code><dd>LRA tries to reuse values reloaded in registers in subsequent insns.
This optimization is called inheritance. EBB is used as a region to
do this optimization. The parameter defines the minimal fall-through
edge probability (in percent) used to add a BB to the inheritance EBB in
LRA. The default value of the parameter is 40. The value was chosen
from numerous runs of SPEC2000 on x86-64.
<br><dt><code>loop-invariant-max-bbs-in-loop</code><dd>Loop invariant motion can be very expensive, both in compilation time and
in amount of needed compile-time memory, with very large loops. Loops
with more basic blocks than this parameter won't have loop invariant
motion optimization performed on them. The default value of the
parameter is 1000 for <samp><span class="option">-O1</span></samp> and 10000 for <samp><span class="option">-O2</span></samp> and above.
<br><dt><code>loop-max-datarefs-for-datadeps</code><dd>Building data dependencies is expensive for very large loops. This
parameter limits the number of data references in loops that are
considered for data dependence analysis. These large loops are not
handled by the optimizations using loop data dependencies.
The default value is 1000.
<br><dt><code>max-vartrack-size</code><dd>Sets a maximum number of hash table slots to use during variable
tracking dataflow analysis of any function. If this limit is exceeded
with variable tracking at assignments enabled, analysis for that
function is retried without it, after removing all debug insns from
the function. If the limit is exceeded even without debug insns, var
tracking analysis is completely disabled for the function. Setting
the parameter to zero makes it unlimited.
<br><dt><code>max-vartrack-expr-depth</code><dd>Sets a maximum number of recursion levels when attempting to map
variable names or debug temporaries to value expressions. This trades
compilation time for more complete debug information. If this is set too
low, value expressions that are available and could be represented in
debug information may end up not being used; setting this higher may
enable the compiler to find more complex debug expressions, but compile
time and memory use may grow. The default is 12.
<br><dt><code>min-nondebug-insn-uid</code><dd>Use uids starting at this parameter for nondebug insns. The range below
the parameter is reserved exclusively for debug insns created by
<samp><span class="option">-fvar-tracking-assignments</span></samp>, but debug insns may get
(non-overlapping) uids above it if the reserved range is exhausted.
<br><dt><code>ipa-sra-ptr-growth-factor</code><dd>IPA-SRA replaces a pointer to an aggregate with one or more new
parameters only when their cumulative size is less or equal to
<samp><span class="option">ipa-sra-ptr-growth-factor</span></samp> times the size of the original
pointer parameter.
<br><dt><code>sra-max-scalarization-size-Ospeed</code><br><dt><code>sra-max-scalarization-size-Osize</code><dd>The two Scalar Reduction of Aggregates passes (SRA and IPA-SRA) aim to
replace scalar parts of aggregates with uses of independent scalar
variables. These parameters control the maximum size, in storage units,
of aggregate which is considered for replacement when compiling for
speed
(<samp><span class="option">sra-max-scalarization-size-Ospeed</span></samp>) or size
(<samp><span class="option">sra-max-scalarization-size-Osize</span></samp>) respectively.
<br><dt><code>tm-max-aggregate-size</code><dd>When making copies of thread-local variables in a transaction, this
parameter specifies the size in bytes after which variables are
saved with the logging functions as opposed to save/restore code
sequence pairs. This option only applies when using
<samp><span class="option">-fgnu-tm</span></samp>.
<br><dt><code>graphite-max-nb-scop-params</code><dd>To avoid exponential effects in the Graphite loop transforms, the
number of parameters in a Static Control Part (SCoP) is bounded. The
default value is 10 parameters. A variable whose value is unknown at
compilation time and defined outside a SCoP is a parameter of the SCoP.
<br><dt><code>graphite-max-bbs-per-function</code><dd>To avoid exponential effects in the detection of SCoPs, the size of
the functions analyzed by Graphite is bounded. The default value is
100 basic blocks.
<br><dt><code>loop-block-tile-size</code><dd>Loop blocking or strip mining transforms, enabled with
<samp><span class="option">-floop-block</span></samp> or <samp><span class="option">-floop-strip-mine</span></samp>, strip mine each
loop in the loop nest by a given number of iterations. The strip
length can be changed using the <samp><span class="option">loop-block-tile-size</span></samp>
parameter. The default value is 51 iterations.
<br><dt><code>loop-unroll-jam-size</code><dd>Specify the unroll factor for the <samp><span class="option">-floop-unroll-and-jam</span></samp> option. The
default value is 4.
<br><dt><code>loop-unroll-jam-depth</code><dd>Specify the dimension to be unrolled (counting from the most inner loop)
for the <samp><span class="option">-floop-unroll-and-jam</span></samp>. The default value is 2.
<br><dt><code>ipa-cp-value-list-size</code><dd>IPA-CP attempts to track all possible values and types passed to a function's
parameter in order to propagate them and perform devirtualization.
<samp><span class="option">ipa-cp-value-list-size</span></samp> is the maximum number of values and types it
stores per one formal parameter of a function.
<br><dt><code>ipa-cp-eval-threshold</code><dd>IPA-CP calculates its own score of cloning profitability heuristics
and performs those cloning opportunities with scores that exceed
<samp><span class="option">ipa-cp-eval-threshold</span></samp>.
<br><dt><code>ipa-cp-recursion-penalty</code><dd>Percentage penalty the recursive functions will receive when they
are evaluated for cloning.
<br><dt><code>ipa-cp-single-call-penalty</code><dd>Percentage penalty functions containing a single call to another
function will receive when they are evaluated for cloning.
<br><dt><code>ipa-max-agg-items</code><dd>IPA-CP is also capable to propagate a number of scalar values passed
in an aggregate. <samp><span class="option">ipa-max-agg-items</span></samp> controls the maximum
number of such values per one parameter.
<br><dt><code>ipa-cp-loop-hint-bonus</code><dd>When IPA-CP determines that a cloning candidate would make the number
of iterations of a loop known, it adds a bonus of
<samp><span class="option">ipa-cp-loop-hint-bonus</span></samp> to the profitability score of
the candidate.
<br><dt><code>ipa-cp-array-index-hint-bonus</code><dd>When IPA-CP determines that a cloning candidate would make the index of
an array access known, it adds a bonus of
<samp><span class="option">ipa-cp-array-index-hint-bonus</span></samp> to the profitability
score of the candidate.
<br><dt><code>ipa-max-aa-steps</code><dd>During its analysis of function bodies, IPA-CP employs alias analysis
in order to track values pointed to by function parameters. In order
not to spend too much time analyzing huge functions, it gives up and
considers all memory clobbered after examining
<samp><span class="option">ipa-max-aa-steps</span></samp> statements modifying memory.
<br><dt><code>lto-partitions</code><dd>Specify desired number of partitions produced during WHOPR compilation.
The number of partitions should exceed the number of CPUs used for compilation.
The default value is 32.
<br><dt><code>lto-min-partition</code><dd>Size of minimal partition for WHOPR (in estimated instructions).
This prevents expenses of splitting very small programs into too many
partitions.
<br><dt><code>cxx-max-namespaces-for-diagnostic-help</code><dd>The maximum number of namespaces to consult for suggestions when C++
name lookup fails for an identifier. The default is 1000.
<br><dt><code>sink-frequency-threshold</code><dd>The maximum execution frequency (in percent) of the target block relative
to a statement's original block that still allows sinking of the
statement. Larger numbers result in more aggressive statement sinking.
The default value is 75. A small positive adjustment is applied for
statements with memory operands, as those are even more profitable to sink.
<br><dt><code>max-stores-to-sink</code><dd>The maximum number of conditional store pairs that can be sunk. Set to 0
if either vectorization (<samp><span class="option">-ftree-vectorize</span></samp>) or if-conversion
(<samp><span class="option">-ftree-loop-if-convert</span></samp>) is disabled. The default is 2.
<br><dt><code>allow-store-data-races</code><dd>Allow optimizers to introduce new data races on stores.
Set to 1 to allow, otherwise to 0. This option is enabled by default
at optimization level <samp><span class="option">-Ofast</span></samp>.
<br><dt><code>case-values-threshold</code><dd>The smallest number of different values for which it is best to use a
jump-table instead of a tree of conditional branches. If the value is
0, use the default for the machine. The default is 0.
<br><dt><code>tree-reassoc-width</code><dd>Set the maximum number of instructions executed in parallel in a
reassociated tree. This parameter overrides the target-dependent
heuristics used by default if it has a non-zero value.
<br><dt><code>sched-pressure-algorithm</code><dd>Choose between the two available implementations of
<samp><span class="option">-fsched-pressure</span></samp>. Algorithm 1 is the original implementation
and is the more likely to prevent instructions from being reordered.
Algorithm 2 was designed to be a compromise between the relatively
conservative approach taken by algorithm 1 and the rather aggressive
approach taken by the default scheduler. It relies more heavily on
having a regular register file and accurate register pressure classes.
See <samp><span class="file">haifa-sched.c</span></samp> in the GCC sources for more details.
<p>The default choice depends on the target.
<br><dt><code>max-slsr-cand-scan</code><dd>Set the maximum number of existing candidates that are considered when
seeking a basis for a new straight-line strength reduction candidate.
<br><dt><code>asan-globals</code><dd>Enable buffer overflow detection for global objects. This kind
of protection is enabled by default if you are using
<samp><span class="option">-fsanitize=address</span></samp> option.
To disable global objects protection use <samp><span class="option">--param asan-globals=0</span></samp>.
<br><dt><code>asan-stack</code><dd>Enable buffer overflow detection for stack objects. This kind of
protection is enabled by default when using <samp><span class="option">-fsanitize=address</span></samp>.
To disable stack protection use <samp><span class="option">--param asan-stack=0</span></samp> option.
<br><dt><code>asan-instrument-reads</code><dd>Enable buffer overflow detection for memory reads. This kind of
protection is enabled by default when using <samp><span class="option">-fsanitize=address</span></samp>.
To disable memory reads protection use
<samp><span class="option">--param asan-instrument-reads=0</span></samp>.
<br><dt><code>asan-instrument-writes</code><dd>Enable buffer overflow detection for memory writes. This kind of
protection is enabled by default when using <samp><span class="option">-fsanitize=address</span></samp>.
To disable memory writes protection use
<samp><span class="option">--param asan-instrument-writes=0</span></samp> option.
<br><dt><code>asan-memintrin</code><dd>Enable detection for built-in functions. This kind of protection
is enabled by default when using <samp><span class="option">-fsanitize=address</span></samp>.
To disable built-in functions protection use
<samp><span class="option">--param asan-memintrin=0</span></samp>.
<br><dt><code>asan-use-after-return</code><dd>Enable detection of use-after-return. This kind of protection
is enabled by default when using <samp><span class="option">-fsanitize=address</span></samp> option.
To disable use-after-return detection use
<samp><span class="option">--param asan-use-after-return=0</span></samp>.
<br><dt><code>asan-instrumentation-with-call-threshold</code><dd>If the number of memory accesses in the function being instrumented
is greater than or equal to this number, use callbacks instead of inline checks.
E.g. to disable inline code use
<samp><span class="option">--param asan-instrumentation-with-call-threshold=0</span></samp>.
<br><dt><code>chkp-max-ctor-size</code><dd>Static constructors generated by Pointer Bounds Checker may become very
large and significantly increase compile time at optimization level
<samp><span class="option">-O1</span></samp> and higher. This parameter sets the maximum number of statements
in a single generated constructor. Default value is 5000.
<br><dt><code>max-fsm-thread-path-insns</code><dd>Maximum number of instructions to copy when duplicating blocks on a
finite state automaton jump thread path. The default is 100.
<br><dt><code>max-fsm-thread-length</code><dd>Maximum number of basic blocks on a finite state automaton jump thread
path. The default is 10.
<br><dt><code>max-fsm-thread-paths</code><dd>Maximum number of new jump thread paths to create for a finite state
automaton. The default is 50.
</dl>
</dl>
</body></html>
```
|
The following is an episode list for Aardman Animations' children's comedy series Shaun the Sheep, in chronological order of first airing on BBC One and CBBC.
Series overview
Regular series
Films
Special series
Episodes
Series 1 (2007)
Series 1 utilized single frame recording with an SDTV professional video camera to create the animation.
Series 2 (2009–2010)
Series 2 consists of 40 episodes and commenced airing in the United Kingdom on 23 November 2009 on BBC One and BBC HD, having already started airing in Germany on 18 October 2009. The series director was Chris Sadler. This series was shot with digital still camera images that were edited into high-definition video. There were major changes to the characters' designs (e.g., the pigs are slimmer; Timmy's mum's eyes are larger and connected; the Farmer now has a moulded line separating his stubble; and Bitzer and Pidsley have more detailed fur), the bull was absent throughout the series, and the title sequence was adapted to reflect these changes.
Series 3 (2012)
Bitzer returned to his Series 1 model, and Pidsley was removed starting with this series. The farmer's glasses are now square-shaped and the bull has returned. The theme song was also rerecorded by Mark Thomas and Vic Reeves. All episodes premiered in Germany on the KiKa channel between 30 November and 9 December 2012. In the UK, the series ran between 25 February and 21 March 2013 on the CBBC channel.
Series 4 (2013)
The fourth series consists of 30 episodes in total. The first 20 episodes began airing internationally on 4 February 2013 and on CBBC on 3 February 2014.
Another ten episodes began airing on the Australian television channel ABC3 starting from 17 September 2014. These episodes began on CBBC on 8 December 2014.
Series 5 (2016)
A fifth series began airing on CBBC on 5 September 2016. The series first aired in the Netherlands from 1 December 2015 to 1 January 2016 and in Australia in January 2016. Mark Thomas and Vic Reeves once again rerecorded the theme tune, which now sounds slightly different.
Series 6 (2020)
A sixth series began airing in 2020 on Netflix under the subtitle Adventures from Mossy Bottom. The series reduced the Bull's appearances, giving the Goat a larger role instead. New characters were introduced, such as Stash the Squirrel and Farmer Ben, and the intro was updated accordingly, now with three variants. This was the only time in the series' run that all 20 episodes were released on the same day, 16 March 2020. CBBC, BBC iPlayer and BBC Two aired the episodes starting in September 2022.
Specials (2015–2021)
Films
Other broadcasts
Shaun the Sheep 3D (2012)
The following is a list of 1-minute stereoscopic 3D shorts created by Aardman for the Nintendo 3DS' Nintendo Video service. These shorts used a slightly smaller team of artists than the main series. Beginning on 15 January 2016, the shorts were released every Friday on the official Shaun the Sheep YouTube channel as "Mossy Bottom Shorts". These versions lack the 3D effect found on the Nintendo 3DS but have high-definition resolution.
Shaun The Sheep Championsheeps (2012)
The following is a list of 1-minute sports-themed shorts that aired on CBBC in July 2012. They were made to coincide with the London 2012 Olympics celebrations. The shorts aired on BBC Kids in October 2013.
Notes
References
Shaun the Sheep
Shaun the Sheep episodes
|
Zulaikha Abd ar-Rahman Abu Risha (born 1942; ) is a Jordanian poet and activist. She has been a vocal advocate of women's rights, particularly concerning making the Arabic language more gender-inclusive.
Early life and education
Zulaikha Abu Risha was born in 1942 in Acre, a city in what is now Israel. She describes herself as having Palestinian, Jordanian, and Syrian roots.
She studied Arabic literature at the University of Jordan, graduating with a bachelor's degree in 1966 and a master's in 1989. She later pursued a doctorate at the University of Exeter, where she wrote her thesis on "Women in Arabic Feminist Literature".
Career
Abu Risha is perhaps best known for her work as a poet and fiction writer. She has been considered a prominent member of the Jordanian literary scene.
In 1987, Abu Risha published the short story collection In the Cell, for which she won a prize from the University of Jordan. She has also written at least 10 books of poetry beginning in 1998, as well as a book of autobiographical essays, Ghajarul ma'a, in 1999. In addition, she has produced several works of children's literature, as well as an academic study of the genre, Towards a Theory of Children's Literature (2002).
Through hosting events in which refugees told folktales, she produced the book Timeless Tales: Folktales Told by Syrian Refugees, containing 21 folk stories.
Abu Risha also writes nonfiction on feminist criticism, literature, art, and gender and language. She has been a columnist for newspapers and magazines in Jordan and across the Arab world. She has also served as editor of the magazines al-Mu'allim/at-talib (published by UNESCO/UNRWA) and Al-Funun (an art journal published by the Jordanian Ministry of Culture), and as director of al-Warraqat li-d-dirasat wal-buhuth, a feminist publishing house. In 2019, she served as a judge for the International Prize for Arabic Fiction.
She has also worked as a university lecturer and served as director of the Center for Women's Studies in Amman, Jordan.
Abu Risha is also known for her work as a human rights and women's rights activist. She has fought to make the Arabic language more inclusive of women, writing two books on the subject: The Absent Language: Towards a Gender-Neutral Language (1996) and The Language Female: Papers on Discourse and Gender (2009). Her women's rights advocacy since the early 1980s has made her a target of extremist groups, which have sought to incite violence against her. She has also been the target of lawsuits from Amman's Public Prosecution Office for comments on Islam.
References
Living people
1942 births
People from Acre, Israel
Jordanian women writers
Jordanian women activists
Jordanian human rights activists
University of Jordan alumni
Jordanian academics
|
is a professional Japanese baseball infielder for the Hanshin Tigers.
Early baseball career
Seiya started playing little league softball in first grade for the Yasuda Yanyan Baseball club, then continued as a baseball player in junior high in his hometown in Aomori. He then played various infield positions for Aomori Yamada High School, but his team never reached any national tournaments.
He entered Asia University in Tokyo and was a regular starter in the Tohto University Baseball League. From his third year, he helped his team win two successive league championships, as well as two championships in the Meiji Jingu National Tournament. In his 40 league appearances, he recorded a 0.236 batting average, 7 RBIs and 7 stolen bases.
Wanting to pursue a career in baseball despite not being selected in the professional draft, he joined the industrial leagues under Honda, where he played various infield positions and batted lead-off or clean-up during Intercity Baseball Tournaments. Because he had never hit a home run, his coaches at Honda helped him develop his lower-body strength and changed his batting form, which resulted in him hitting a total of 24 home runs during his two years in the league.
Hanshin Tigers
He was the Tigers' third pick in the 2018 Nippon Professional Baseball draft. He signed a 60 million yen contract with Hanshin, with an estimated annual salary of 10 million yen. He was assigned jersey number 0, the same number he wore when he played for Honda.
2019
He joined the main team for spring training in Okinawa, and competed with Fumiya Hojoh and veteran Takashi Toritani for the shortstop position. On March 23, he broke the NPB record for most hits by a rookie in pre-season exhibition games when he notched his 22nd hit. This performance earned him the shortstop position and lead-off spot for the season-opening series against the Yakult Swallows. On March 29, he and fellow rookie Koji Chikamoto became the first pair of Hanshin rookies in 47 years to bat as lead-off hitters in a season opener. He was eventually removed from the line-up after failing to produce a hit, but notched his first career hit on April 12 as a pinch hitter against the Dragons, and a few days later hit his first career home run, with two runners on base, against Tomoyuki Sugano.
His appearances gradually increased by June, either as a lead-off or second batter, and he went on a seven-game hitting streak. With the arrival of new import Yangervis Solarte, however, he alternated with Solarte until he was removed from the line-up in July. He redeemed himself in August by batting 0.431, and went on a 13-game hitting streak that tied the team rookie record set by Chikamoto earlier in the season.
He finished the season with a 0.262 average, 4 home runs and 32 RBIs in 113 games.
References
External links
Nippon Professional Baseball Stats
1994 births
Living people
Hanshin Tigers players
Japanese baseball players
Baseball people from Aomori Prefecture
People from Aomori (city)
|
```java
/*
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 */
package com.baomidou.mybatisplus.core.toolkit.support;
import java.lang.invoke.MethodHandle;
import java.lang.invoke.MethodHandleProxies;
import java.lang.invoke.MethodHandles;
import java.lang.reflect.Executable;
import java.lang.reflect.Proxy;
/**
* IDEA Evaluate Lambda
* <p>
* Create by hcl at 2021/5/17
*/
public class IdeaProxyLambdaMeta implements LambdaMeta {
private final Class<?> clazz;
private final String name;
public IdeaProxyLambdaMeta(Proxy func) {
// Recover the method handle backing the single-method interface proxy,
// then crack it into a reflective Executable to read its metadata.
MethodHandle dmh = MethodHandleProxies.wrapperInstanceTarget(func);
Executable executable = MethodHandles.reflectAs(Executable.class, dmh);
clazz = executable.getDeclaringClass();
name = executable.getName();
}
@Override
public String getImplMethodName() {
return name;
}
@Override
public Class<?> getInstantiatedClass() {
return clazz;
}
@Override
public String toString() {
return clazz.getSimpleName() + "::" + name;
}
}
```
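The two JDK calls in the constructor above can be exercised outside IDEA as well. The sketch below (class and interface names are hypothetical, not part of MyBatis-Plus) builds a direct method handle, wraps it in an interface proxy via `MethodHandleProxies.asInterfaceInstance`, and then recovers and cracks it with the same `wrapperInstanceTarget`/`reflectAs` pair:

```java
import java.lang.invoke.MethodHandle;
import java.lang.invoke.MethodHandleProxies;
import java.lang.invoke.MethodHandles;
import java.lang.invoke.MethodType;
import java.lang.reflect.Executable;

public class CrackProxyDemo {
    // asInterfaceInstance requires a public single-method interface; its
    // method type matches the handle exactly, so no adapter is inserted.
    public interface IntToString {
        String convert(int x);
    }

    public static void main(String[] args) throws Throwable {
        // A direct method handle for String.valueOf(int).
        MethodHandle mh = MethodHandles.lookup().findStatic(
                String.class, "valueOf", MethodType.methodType(String.class, int.class));

        // Wrap it in an interface proxy, the kind of object
        // IdeaProxyLambdaMeta receives from IDEA's lambda evaluation.
        IntToString fn = MethodHandleProxies.asInterfaceInstance(IntToString.class, mh);

        // Recover the underlying handle and crack it back to a reflective
        // Executable: the same two calls the constructor above performs.
        MethodHandle target = MethodHandleProxies.wrapperInstanceTarget(fn);
        Executable exec = MethodHandles.reflectAs(Executable.class, target);

        System.out.println(exec.getDeclaringClass().getSimpleName() + "::" + exec.getName());
        // prints "String::valueOf"
    }
}
```

Because `wrapperInstanceTarget` returns the original direct method handle passed to `asInterfaceInstance`, `reflectAs` can crack it; `reflectAs` would throw for adapted (non-direct) handles.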
|
Bradley Moran (born 29 September 1986 in England) is a former Australian rules footballer who played for the North Melbourne Football Club and Adelaide Football Club in the Australian Football League (AFL). Since his retirement from the AFL in 2011, Moran has worked in business and founded two startup technology companies: NoQ in 2011 and CitrusAd in 2017. CitrusAd was acquired in July 2021 by French company Publicis for a reported $205 million.
Early life
Born in Solihull in the West Midlands of England to English parents, Moran's father Martyn was a former junior soccer player. Moran grew up in Stratford-upon-Avon with dreams of becoming a soccer player.
As a youth, he represented West Midlands county in rugby union. He was also a representative soccer player.
Moran moved to Australia with his family as a 15-year-old, looking to pursue a career in rugby.
One of his school friends encouraged him to try Aussie Rules with the Surfers Paradise AFC juniors, where he was mentored by former Brisbane Bears captain Roger Merrett. He took to the game quickly and developed a passion for it. His high school, The Southport School, had a policy against Australian Football, so he continued playing club football and instead also played basketball and rowing at school.
Moran quickly showed ability and talent in the ruck. He broke his wrist leading up to the Under 18 national championships, which set back his recruitment. However, after playing football with the Southport Sharks, at age 18 he was recruited to the elite level by the Kangaroos Football Club in the 2004 AFL draft.
AFL career
Wearing the number 18 guernsey, previously worn by Wayne Carey, Moran made his debut in 2006 against Hawthorn at Aurora Stadium. He played an effective game in one of the Roos' worst performances of the season. He collected 21 disposals, 10 marks and 10 hitouts, which earned a nomination for the AFL Rising Star Award. He continued his good form into the following week against Collingwood, where the Roos were comprehensively beaten in the second half.
Moran was traded to the Adelaide Football Club at the end of the 2007 Premiership season. He took up the number 2 guernsey, which was also previously worn by Carey during his brief stint at the Crows. After injuries ruined the first half of his 2008 season, he played his first game for the Crows in round 16 and impressed as a tall defender and ruckman, thereafter becoming a fixture in the lineup. When moved into the forward line against Carlton in round 18 to cover the loss of Jason Porplyzia, Moran booted four goals for the game to help the Crows to an eight-point victory, thus adding another string to his bow as a utility.
In 2009, Moran quickly became a regular in the Adelaide side, forming a ruck combination with Ivan Maric. However, midway through the season he injured his knee, which would keep him out for the remainder of the season.
Moran announced his retirement on 31 August 2011 after ongoing injuries.
International qualification
Moran was eligible to represent Great Britain at the 2008 Australian Football International Cup as the criteria at the time would have allowed him to compete as a part of the British team since he was born in England and moved to Australia as a teenager. However, Great Britain's national team deliberately overlooked Moran because he did not learn the game in England. Moran continues to be a strong advocate for the sport as an ambassador for Australia international rules football team, appearing in an England Dragonslayers guernsey for its junior league promotions.
Post-AFL career running startup technology businesses
In 2011, after his retirement from the Adelaide Crows, Moran jumped into a new field by launching a startup technology business called NoQ (pronounced no queue), which offered a smartphone application that allowed the user to dodge the queue by pre-ordering food and drinks such as coffee. During the next five years, Moran successfully raised capital for NoQ (now egrowcery.com) from investors including Bendigo & Adelaide Bank and secured clients such as Westfield Group and Noodle Box. He left the business in 2016 to move in to the customer experience sector.
In March 2017, Moran and former NoQ colleague Nick Paech launched an advertising technology start-up called CitrusAd, based in Brisbane. By placing AI technology at the core of retailing strategy, Moran believed retailers could leverage their data in real time to improve sales and open a new line of revenue by allowing brands to easily and more effectively use that data for highly targeted ads at the point of purchase. With the impending deprecation of third-party cookies, sources of first-party data became a larger focus for marketers seeking to target effectively. Retailer first-party data became coveted, as software platforms such as the one Moran offered through CitrusAd could mine large amounts of shopper data to ensure that relevant offers were served on retailer websites and apps in a more personalised manner. Moran's self-serve, simple-to-use, white-label platform became a "plug and play" option for retailers to compete with higher-tech retailers such as Amazon. This upended the status quo: business roles reversed, with retailers selling to suppliers and CPG brands moving into buying roles to purchase retail media supplied by their retail business customers. Moran used built-in algorithms to reduce the inefficiency of serving ads for products that were either out of stock or not relevant to the shopper, while taking into account margin, velocity and other metrics.
Brands saw an advantage over traditional advertising tactics by having access to retailer data via the AI used for matching shoppers to applicable brands resulting in better ROAS (return on ad spend) for brand advertisers. Moran’s team built a closed-loop analytics dashboard on the CitrusAd platform to measure ROI from ads served to sales transactions. The platform became a key component in the digital transformation of retailing by delivering personalization at scale to improve shopper experiences.
In August 2018, Moran shared that 14 Australian retailers, including Dan Murphy's, were already using the platform. By June 2019, CitrusAd had signed Coles Group, Ocado, Techdata, Woolworths and the company's first U.S. retailer, Hy-Vee.
In June 2020, MA Financial Group (part of Moelis & Company) announced an investment of A$6.5 million, adding CitrusAd to its list of growth-stage companies and funding CitrusAd's international growth. Shortly thereafter, Moran announced CitrusAd had signed new international clients including Groupon in the U.S. and Sainsbury's in the UK.
In July 2021, French company Publicis Groupe announced it had acquired CitrusAd for an undisclosed amount. Moran continues to hold a senior executive role at CitrusAd under its new owner. In October 2021, the Australian Financial Review reported that Publicis had paid $205 million for CitrusAd.
In December 2021, Moran won the Digital Disruptor award in the 2021 Australian Young Entrepreneur Awards.
In a Forbes magazine article of February 28, 2022, "Retail Media Networks Are One Of The Most Important Trends Of 2022, But They Need To Evolve", Moran is quoted as predicting that 2022 would see retailers and brands extend their long-standing knowledge of how in-store shelf placement affects sales to how online brand placement affects both online purchasing and in-store behavior.
Awards and recognitions
2019 Pearcey Queensland Entrepreneur Award Finalist.
Top 20 under 40 Young Business Stars.
Digital Disruptor award - 2020 & 2021 Australian Young Entrepreneur Awards.
Named as one of the Top 100 Digital Entrepreneurs in Australia.
References
External links
2006 WFN story – Brad Moran stars on debut for North Melbourne
1986 births
Living people
Adelaide Football Club players
Australian people of English descent
Australian rules footballers from Queensland
English players of Australian rules football
English rugby union players
North Melbourne Football Club players
Rugby union players from Solihull
Southport Australian Football Club players
VFL/AFL players born in England
West Adelaide Football Club players
|
The phrase "blood, toil, tears and sweat" became famous in a speech given by Winston Churchill to the House of Commons of the Parliament of the United Kingdom on 13 May 1940. The speech is sometimes known by that name.
Background
This was Churchill's first speech since becoming prime minister. It was made on 13 May 1940 to the House of Commons after having been offered the King's commission the previous Friday, to become Prime Minister of the United Kingdom in the first year of World War II. Churchill had replaced Neville Chamberlain on 10 May, and in this speech he asked the House to declare its confidence in his Government. The motion passed unanimously. This was the first of three speeches which he gave during the period of the Battle of France, which commenced with the German invasion of the Low Countries on 10 May.
History
Churchill had used similar phrases earlier, such as "Their sweat, their tears, their blood" in 1931, and "new structures of national life erected upon blood, sweat, and tears" in 1939.
Churchill's sentence, "I have nothing to offer but blood, toil, tears and sweat," has been called a paraphrase of one uttered on 2 July 1849 by Giuseppe Garibaldi when rallying his revolutionary forces in Rome: "I offer hunger, thirst, forced marches, battle, and death." As a young man, Churchill had considered writing a biography of Garibaldi. The circumstances under which Garibaldi made that speech—with the revolutionary Roman Republic being overwhelmed and Garibaldi needing to maintain the morale of his troops towards a highly hazardous retreat through the Apennine mountains—was in some ways comparable to Britain's situation with France being overwhelmed by the German offensive.
Theodore Roosevelt uttered a phrase similar to Churchill's in an address to the United States Naval War College on 2 June 1897, following his appointment as federal Assistant Secretary of the Navy: "Every man among us is more fit to meet the duties and responsibilities of citizenship because of the perils over which, in the past, the nation has triumphed; because of the blood and sweat and tears, the labor and the anguish, through which, in the days that have gone, our forefathers moved on to triumph."<ref>James A. Billington. 2010. [https://books.google.com/books?id=91IFAYFhtOMC&pg=PA13 Respectfully Quoted: A Dictionary of Quotations] Courier Dover Publications, p. 6.</ref> Churchill's line has been called a "direct quotation" from Roosevelt's speech. Churchill, a man with an American mother and a keen soldier, was likely to have read works by Theodore Roosevelt, who was a widely published military historian; it is also possible he read the speech after being appointed First Lord of the Admiralty, a position similar to Roosevelt's.
Other versions of the phrase are "It [poetry] is forged slowly and painfully, link by link, with blood and sweat and tears" (Lord Alfred Douglas, 1919), "Blood, sweat, and tear-wrung millions" (Lord Byron, 1823), and "...mollifie/ It with thy teares, or sweat, or blood" (John Donne, 1611). In Latin, Cicero and Livy had used the phrase "sweat and blood".
Excerpts
We are in the preliminary stage of one of the greatest battles in history.... That we are in action at many points—in Norway and in Holland—, that we have to be prepared in the Mediterranean. That the air battle is continuous, and that many preparations have to be made here at home.
I would say to the House as I said to those who have joined this government: "I have nothing to offer but blood, toil, tears and sweat". We have before us an ordeal of the most grievous kind. We have before us many, many long months of struggle and of suffering.
You ask, what is our policy? I will say: It is to wage war, by sea, land and air, with all our might and with all the strength that God can give us; to wage war against a monstrous tyranny, never surpassed in the dark and lamentable catalogue of human crime. That is our policy. You ask, what is our aim? I can answer in one word: Victory. Victory at all costs—Victory in spite of all terror—Victory, however long and hard the road may be, for without victory there is no survival.
(Text as given in Hansard)
Reaction
Churchill had not been the preferred choice of most Conservatives to succeed Chamberlain, but the motion on 13 May "That this House welcomes the formation of a Government representing the united and inflexible resolve of the nation to prosecute the war with Germany to a victorious conclusion" passed unanimously. He had been unpopular in many circles since the 1930s and MPs had ignored or heckled speeches in which he denounced the prime minister's appeasement policy toward Germany; even others who opposed Chamberlain avoided him. One historian has described the speech's effect on Parliament, however, as "electrifying ... He was still speaking at the House of Commons, but it was now listening, and cheering." (However, Churchill himself subsequently held that many Conservative MPs had still regarded him with reserve and it was not until his speech of 4 July 1940 announcing British action against the French fleet at Mers-el-Kébir that he could feel he had the full support of the whole House.) Other great speeches followed, including the "We shall fight on the beaches" speech of 4 June and the "This was their finest hour" speech of 18 June, and were a great inspiration and unifying force to Britain after its defeats in the first year of the war.
Legacy
On 26 April 2013, the Bank of England announced that the phrase "I have nothing to offer but blood, toil, tears and sweat" was to adorn the new £5 note beneath a portrait of Churchill. It was issued in September 2016.
See also
"We shall fight on the beaches"
"Never was so much owed by so many to so few"
References
Further reading
John Lukacs, Five Days in London: May 1940 (Yale University, New Haven, 2001) is a good look at the political situation in the British government when Churchill made this speech.
External links
The Churchill Centre: Blood, Toil, Tears and Sweat , with a short introduction
Transcription and MP3 recording
1940 in the United Kingdom
World War II speeches
British political phrases (Pre 1950)
Quotations from military
English-language idioms
Speeches by Winston Churchill
May 1940 events
1940 speeches
1940 in politics
1940s neologisms
|
, often referred to simply as Re:Zero and also known as Re: Life in a different world from zero, is a Japanese light novel series written by Tappei Nagatsuki and illustrated by Shin'ichirō Ōtsuka. It started serialization as a web novel on the user-generated website Shōsetsuka ni Narō in 2012. Thirty-five light novels, as well as five side story volumes and nine short story collections, have been published by Media Factory under their MF Bunko J imprint. The story centers on Subaru Natsuki, a hikikomori who suddenly finds himself transported to another world on his way home from the convenience store.
The series' first three arcs have been adapted into separate manga series. The first, by Daichi Matsue, was published between June 2014 and March 2015. The second, by Makoto Fugetsu, has been published by Square Enix since October 2014. Matsue launched the third adaptation, also published by Media Factory, in May 2015. Additionally, Media Factory has published two anthology manga with stories by different artists. An anime television series adaptation produced by White Fox aired from April to September 2016, starting with an hour-long special. Two original video animation (OVA) episodes were released in October 2018 and November 2019. In March 2017, game developer 5pb. released a visual novel based on the series. A second season aired in a split-cour format, with the first half airing from July to September 2020, and the second half airing from January to March 2021. A third season has been announced.
The novels and all three manga adaptations are published in North America by Yen Press. The anime adaptation has been licensed by Crunchyroll outside Asia, which released the anime on home video through Funimation in the United States and Anime Limited in the United Kingdom. In Southeast Asia and South Asia, the series is licensed by Muse Communication.
The overall series (light novel and manga volumes) had over 13 million copies in circulation by March 2023 (including digital versions), while the anime series has sold more than 70,000 copies on home video. The light novels have been praised for their fresh take on the "another world" concept, fleshed-out characters, complex world and lore, and thought-provoking topics and themes. The series received awards at the 2015–2016 Newtype Anime Awards and the 2017 Sugoi Japan Awards, and was nominated for Anime of the Year at the 1st Crunchyroll Anime Awards.
Plot
Subaru Natsuki is a NEET who is suddenly summoned to a fantasy world. Just after arriving, he is killed while trying to help Emilia, a young half-elf he befriends who is a candidate to become the next ruler of the Kingdom of Lugunica, only to revive several hours in the past. After dying several times, Subaru realizes that he has the power to turn back time upon his death. After successfully helping Emilia, Subaru starts living in one of Roswaal L. Mathers' mansions as a butler. Out of gratitude and affection for Emilia, Subaru uses his newfound ability to protect her and to support her ambition to be appointed as the next queen, also assisting other friends he makes along the way, while suffering the pain inflicted on him every time he dies and carrying the memories of everything that happened before his power activates, which everyone except him forgets.
Production
Light novel
In the late 2000s, the light novel series The Familiar of Zero (Zero no Tsukaima) spawned a number of fan fiction on the website Shōsetsuka ni Narō ("Let's Become Novelists"), also known as Narō. Tappei Nagatsuki initially began writing The Familiar of Zero fan fiction on Narō, before building on its isekai ("other world") concept to write his own original web novel series on Narō, called Re:Zero, which began serialization in 2012.
The series' editor at MF Bunko J, Masahito Ikemoto, first became aware of the web novel in April 2013, when it began to appear on his Twitter feed. He was immediately impressed by the series' use of Return by Death, and how it was a "depressing, yet surprising, twist on the fantasy genre," and began working with Nagatsuki to adapt the series into a light novel. Most light novels are around 250 pages in length, but Nagatsuki submitted a manuscript of more than 1,000 pages for the first novel, forcing Ikemoto to edit it heavily. While Nagatsuki wanted to engage in worldbuilding early on, Ikemoto felt that it was more necessary to make the readers feel engaged with the characters. He ended up rearranging the story so that parts focusing on the world and its lore were pushed back to the third arc of the series.
Prior to his involvement in Re:Zero, illustrator Shin'ichirō Ōtsuka worked on video games, which led him to draw the backgrounds first when illustrating the series. After reading the web novel, he submitted a number of character designs for the major characters to Ikemoto. Subaru's initial design made him look like a delinquent, with Otsuka later describing it as "not the face of a boy in his teens," leading Ikemoto to request that the character be "more friendly and less fierce" so that the audience could empathize with him during emotional scenes. Originally, Emilia's character design appeared extremely plain, so a number of features were added to make her more interesting. Ikemoto specified that she must fit the "archetypal heroine" mold. Rem and Ram also underwent significant changes from the first draft: their original designs lacked the characteristic hair parts, and their maid uniforms were longer and more "traditional."
Anime
Development and production
The possibility of an anime adaptation came up early in the development of the series; Shō Tanaka, a producer at Kadokawa, asked Ikemoto about properties which might lend themselves to being animated, and Ikemoto recommended that Tanaka read Nagatsuki's web novels. Despite an initial miscommunication which led to Ikemoto believing that Tanaka wasn't interested, talks of adapting the series began soon after the web novels began the transition to print.
As part of talks for the potential anime adaptation, Ikemoto and Tanaka spoke to Tsunaki Yoshikawa, an animation producer at studio White Fox, about the possibility of his studio animating the series. Hoping to adapt the series into an anime similar to Steins;Gate (which White Fox also produced), and having a positive impression of the studio as one that did faithful adaptations, Tanaka then formally approached them about producing the show. White Fox's president contacted Yoshikawa for his opinion, and Yoshikawa recommended they accept, as long as the series "doesn't violate any broadcasting regulations."
Production on the anime began sometime after the release of the fifth novel in October 2014. Masaharu Watanabe was chosen by Yoshikawa to direct the series because he had previously worked for the studio doing key animation, while Kyūta Sakai was chosen to be the series' character designer and chief animation director because Yoshikawa felt that she would be able to do the novel's art justice whilst maintaining a consistent animation quality throughout the series' 25-episode run. Masahiro Yokotani was brought on board as the main writer, with the series being his first time composing for a "reborn in another world"-type story. Yoshikawa warned him about the violence in the series, but Yokotani was still surprised by the violent and disturbing scenes in novels three and beyond, having only read the first novel when he agreed to work on the project; he delegated the script writing of those episodes in the second cour to the other two scriptwriters. Yoshiko Nakamura joined the project sometime after Yokotani had completed the script for episode 3. When it proved unfeasible for Yokotani and Nakamura to write the scripts alone, the decision was made to bring another scriptwriter on board. Gaku Iwasa, the president of White Fox, asked them to hire someone "younger," leading Yokotani to suggest Eiji Umehara. Nagatsuki had recently been playing Chaos;Child, which Umehara had written for, and he approved the choice, suggesting that they let Umehara write the "painful parts"; Umehara was invited to join the project around the time that the scripts for episodes 8 and 9 were being written. Re:Zero was the first light novel adaptation that either of the screenwriters had worked on.
Original author Tappei Nagatsuki was very active in the production of the anime, attending script meetings and recording sessions. When the staff would encounter a problem with a scene, he would occasionally write lines for them to use as reference while writing the script. The series was not initially intended to have 25 episodes, but was extended to give more time to the battle with the White Whale (which was expanded from two to three episodes) and to the content of episode 18 (episodes 16 to 18 were originally supposed to be covered in two episodes). Watanabe's main directive to the staff was to "capture the mood of the novel as much as possible"; the scriptwriters had discussions about how to compress the dense source material without losing the central elements of the story, and Nakamura recalls working with composition notes that "went on for pages." While planning and scripting the anime, choosing a proper conclusion was one of the most difficult parts for the staff, and a significant amount of time was devoted to choosing what to cover in the final episode, which included material not yet covered in the light novel.
After joining the project, both Nakamura and Umehara had to adjust their views of the main character, and were forced to rewrite scenes where they had made Subaru appear "cool." At Watanabe's direction, Nakamura was made to rewrite Subaru's telling of The Red Ogre Who Cried in episode 6 multiple times. The staff also had difficulty deciding on a song to use for Subaru's ringtone that plays during the closing scene of episode 19, considering songs like "Kanpaku Sengen," "The Beard Song," and "M" by Princess Princess, before settling on "Yoake no Michi" from Dog of Flanders.
Soundtrack
While choosing a composer to produce the series' music, director Watanabe wanted to choose someone who had "hit a nerve" with him. A fan of drama series, Watanabe was struck by a piece of music in the medical drama Death's Organ, and found that the series' composer, Kenichiro Suehiro, had also worked on a number of his favorite anime and drama series. After Suehiro was attached to the production, Watanabe gave him three major guidelines: use human voices during the Return by Death sequences; compose the music like he would for a drama or a movie to capture the emotional scenes; and "pull all the stops" for the suspenseful scenes. Additionally, for the first cour, Watanabe asked for music with a "suspenseful" vibe, while requesting music with a "romantic" feel for the second cour. Both Watanabe and Suehiro are fans of Italian composer Ennio Morricone, and Suehiro tried to take inspiration from his works while composing the soundtrack. Watanabe also requested that there be songs that mimicked Hans Zimmer's score from The Dark Knight. While Suehiro used music that wasn't very "anime-ish" during most of the series, he was asked to use more traditional anime music during the slice of life scenes. A number of times during the series, such as in episodes 7 and 15, Watanabe made it a point to use an entire song, something which is unusual in most anime.
The series makes limited use of its opening and ending themes, and Watanabe has said that he wished he could use them more frequently.
Media
Web novel
The Re:Zero web novel was initially serialized by Tappei Nagatsuki (writing under the username ) on the user-generated content site Shōsetsuka ni Narō from April 20, 2012, onwards. As of February 13, 2021, six story arcs have been completed and two "EX" side stories have been published, with the seventh arc in progress. In total, the web novel has 609 chapters available.
Light novels
Following the web novel's publication, Media Factory acquired the series for print publication. The first light novel volume, with illustrations by Shin'ichirō Ōtsuka, was published on January 24, 2014, under their MF Bunko J imprint. As of September 2023, thirty-five volumes have been published, as well as five side story volumes and nine short story collections. Nagatsuki and Ōtsuka began publishing a series of short side-stories focusing on characters from the series in Monthly Comic Alive, starting with the character Elsa in August 2016. It was followed by one focused on Petra Leyte on November 26, 2016, and one featuring Ram and Rem on January 27, 2017. The light novels are published in English by Yen Press, who announced their acquisition of the license via Twitter on December 2, 2015. The publisher has also acquired the license to the Re:Zero EX side novels.
Manga
A manga adaptation by Daichi Matsuse, titled , began serialization in the August 2014 issue of Media Factory's seinen manga magazine Monthly Comic Alive on June 27, 2014. The final volume was released on March 23, 2015. On December 2, 2015, Yen Press announced that they had licensed the series.
A second manga, titled , with art by Makoto Fugetsu, began serialization in Square Enix's seinen magazine Monthly Big Gangan on October 25, 2014. The final chapter was published on December 24, 2016, and an extra chapter was published on January 25, 2017. The second adaptation has also been licensed by Yen Press.
Daichi Matsuse began serializing a third manga in Monthly Comic Alive's July 2015 issue on May 27, 2015. Yen Press will publish the third adaptation as well.
A manga anthology, titled , was published by Media Factory on June 23, 2016. A second anthology was published on September 23, 2017.
Internet radio show
An Internet radio show to promote the series, named , began broadcasting on March 27, 2016. The show was aired every Monday and was hosted by Rie Takahashi, the voice actress for Emilia. Guests that appeared on the show included Yūsuke Kobayashi (Subaru Natsuki), Inori Minase (Rem), Yumi Uchiyama (Puck), Rie Murakawa (Ram), Satomi Arai (Beatrice), Chinatsu Akasaki (Felt), Kana Ueda (Anastasia Hoshin), and Yui Horie (Felix). The show ran for 33 episodes and concluded on December 19, 2016. The first radio CD, which contains episodes 1–8 of the show, was released on June 27, 2016. The second, which contains episodes 9–16 of the show, was released on September 28, 2016. The third, containing episodes 17–24, was released on November 30, 2016, and the fourth, containing episodes 25–33, was released on March 29, 2017.
Anime
An anime television series adaptation was announced by Kadokawa in July 2015. The series is directed by Masaharu Watanabe and written by Masahiro Yokotani, with animation by the studio White Fox. Kyuta Sakai is serving as both character designer and as chief animation director. Music for the series is composed by Kenichiro Suehiro. Kentaro Minegishi is the series' director of photography, and Yoshito Takamine serves as art director. Jin Aketagawa handled sound direction for the anime, and sound effects were produced by Yuji Furuya. Other staff members include Hitomi Sudo (editing), Yu Karube (3D director), Saaya Kinjō (art configuration), Izumi Sakamoto (color design), and Noritaka Suzuki and Gōichi Iwabatake (prop design).
The 25-episode series premiered on April 4, 2016, with an extended 50-minute first episode. It was broadcast on TV Tokyo, TV Osaka, TV Aichi, and AT-X. The series was simulcast by Crunchyroll. Episodes 14 and 18 ran two minutes longer than a typical anime episode, clocking in at 25 minutes and 45 seconds. The final episode ran three minutes longer, clocking in at 27 minutes and 15 seconds.
A series of anime shorts featuring chibi versions of the characters, titled , were produced by Studio Puyukai to accompany the series. The shorts ran for eleven episodes before being replaced by a new series of shorts, titled , which began airing on June 24, 2016, and ran for 14 episodes. The shorts are directed, written, and produced by Minoru Ashina, with character designs by Minoru Takehara, who also animated the series alongside Sumi Kimoto and Chisato Totsuka. Kenichiro Suehiro reprised his role as composer for the shorts, while Tomoji Furuya of Suwara Pro produced the sound effects. Jin Aketagawa directed the sound at production company Magic Capsule.
The shorts aired on AT-X after each episode of the regular series, starting on April 8, 2016. Crunchyroll acquired the streaming rights to both shorts. Animax Asia later aired the series starting on January 13, 2017.
An original video animation (OVA) episode was announced at the "MF Bunko J Summer School Festival 2017" event on September 10, 2017. All of the main staff and cast returned for the OVA, with Tatsuya Koyanagi joining as chief director. Titled Memory Snow, the OVA was screened in Japanese theaters starting on October 6, 2018. A second OVA, titled , was announced on September 23, 2018. The OVA is an adaptation of the prequel novel which was included with the first Japanese Blu-ray release of the television series, and focused on the meeting of Emilia and Puck. It was released in Japanese theaters on November 8, 2019.
The series is licensed by Crunchyroll outside of Asia and by Muse Communication in Southeast Asia and South Asia. Funimation announced during their Katsucon 2018 panel that they would release it on home video with an English dub in North America as part of the two companies' partnership. Funimation released the first part of the first season on DVD and Blu-ray in North America on June 19, 2018, and the second part on February 5, 2019. Funimation later released all of season 1 on one Blu-ray on June 9, 2020. In the United Kingdom, the series is distributed by Anime Limited, who released the first part of the first season on DVD on June 25, 2018, and the second part on May 20, 2019. Anime Limited later released both parts on Blu-ray on August 19, 2019, with a complete collection released on February 15, 2021. Both Funimation and Anime Limited's Season 1 Part 1 Blu-ray releases received negative attention after it was discovered that they showed visible color banding and compression artifacts. Madman Anime released the first part of the first season in Australia on Blu-ray on May 8, 2019.
On March 23, 2019, it was announced that a second season was in production, with the cast and staff reprising their roles. It was scheduled to premiere in April 2020, but was delayed to July 2020 due to production complications caused by the COVID-19 pandemic. Ahead of the second season's premiere, an edited version of the first season began airing on January 1, 2020, on AT-X and other channels, recapping the first season in one-hour episodes with new additional footage. The OVA "Memory Snow" was also broadcast between episodes 5 and 6 of the edited version.
The second season was announced to be in a split-cour format, with the first half airing from July 8 to September 30, 2020, and the second half from January 6 to March 24, 2021. The English dub for Season 2 Part 1 began airing on August 26. Episodes have no standard length, with runtimes varying from 24 to 30 minutes.
A third season of the anime series was announced at AnimeJapan 2023.
The series also became part of Isekai Quartet, a crossover comedy series with characters drawn in a chibi style. It also features characters from the light novel series KonoSuba, Overlord, and The Saga of Tanya the Evil, all published by Kadokawa Corporation. The anime premiered on April 9, 2019. A second season aired in 2020, while a theatrical film premiered in 2022.
Music
The first opening theme song was "Redo" by Konomi Suzuki, and the first ending theme was "Styx Helix" by Myth & Roid, while for episode 7 the ending theme was "Straight Bet," also by Myth & Roid. The second opening theme song, titled "Paradisus-Paradoxum," was performed by Myth & Roid, while the second ending theme, "Stay Alive," was performed by Rie Takahashi. Myth & Roid also performed the ending theme for episode 14 titled "theater D."
The second season's first opening theme song was "Realize" by Konomi Suzuki, while the second season's first ending theme song was "Memento" by Nonoc. The second season's second opening theme song was "Long Shot" by Mayu Maeshima (former vocalist of Myth & Roid), while the second season's second ending theme song was "Believe In You" by Nonoc.
The series' soundtrack was released on CD on October 26, 2016. The disc contains 21 tracks composed by Kenichiro Suehiro.
"Redo," Suzuki's 10th single, was released on CD on May 11, 2016. The single was also released as a limited edition with a DVD featuring a music video, a live concert video, and a "making of" video. The songs were performed by Suzuki, with lyrics by Genki Mizuno and arrangement by Makoto Miyazaki.
The CD for "Styx Helix," the series' first ending theme, was Myth & Roid's 3rd single. Written, arranged, and performed by the group, it was released on May 25, 2016, and included both regular and instrumental versions of "Styx Helix" and "STRAIGHT BET."
"Stay Alive," the second ending theme, was released as a single on August 24, 2016. The songs were performed by Takahashi (Emilia) and Minase (Rem). The songs were written and arranged by Heart's Cry.
Myth & Roid released the second opening theme as a single on August 24, 2016. The CD included regular and instrumental versions of "Paradisus-Paradoxum" and "theater D."
For Memory Snow, three pieces of theme music were used: the ending theme "Memory Memory Snow" and the image song "Relive" by Nonoc, and the insert song "Memories" by Riko Azuna.
Video games
In August 2016, game developer 5pb. announced that they were developing a visual novel based on the series, titled . The game follows an original story that differs from the light novel and the anime, and allows the player to choose between routes featuring Emilia, Rem, Ram, Felt, Beatrice, Crusch, Priscilla, or Anastasia. A DLC allowed players who pre-ordered the game to replace the characters' costumes with swimsuits. The opening theme, , was performed by Suzuki, who sang the anime's first opening theme, while the ending theme, , was performed by Minase and Murakawa. The game received a generally positive score of 30/40 from Famitsu.
In Japan, the game was originally scheduled to be released for PlayStation 4 and PlayStation Vita on March 23, 2017, but was delayed to March 30, 2017, due to certain circumstances. The limited edition of the game came with a soundtrack CD and either a Ram (for the PS4 version) or Rem (for the PSVita version) SD figure.
A virtual reality app that allows the user to interact with the character Rem was released for iOS and Android on May 26, 2017. A version featuring the character Emilia was released on June 6, 2017. The game was later ported to both PC and to the PlayStation VR.
A role-playing mobile game called Re:Zero − Starting Life in Another World: Infinity, made by Tianjin Tianxiang Interactive Technology and authorized by White Fox, was released on January 14, 2020, in China. Another mobile game, made by Sega and titled , was released for Android and iOS on September 9, 2020. In the game, the player takes on the role of protagonist Subaru Natsuki and relives the story of the anime, from which the player can branch into "What If" stories. Furthermore, a new story original to the game was produced under the full supervision of original author Tappei Nagatsuki.
A tactical adventure video game titled , developed by Chime and published by Spike Chunsoft, was released for PlayStation 4, PC, and Nintendo Switch in January 2021. The game has an original story, was produced under the full supervision of original author Tappei Nagatsuki, and is illustrated by the series' illustrator Shin'ichirō Ōtsuka. It is the first official Re:Zero game to have an English release.
A role-playing browser game called Re:Zero -Starting Life in Another World- Forbidden Book and the Mysterious Spirit, made by DMM Games, was released on July 14, 2021, in Japan. On March 14, 2022, DMM Games announced that the game would be shut down on July 14, 2022.
A mobile game called Re:Zero − Starting Life in Another World: Witch's Re;surrection was announced at the series' stage in AnimeJapan 2023.
Other media
Kadokawa published a 272-page guide to the series' first three arcs, titled Re:zeropedia, alongside the 10th volume of the novels on October 24, 2016. An official dōjinshi art book was published at Comiket, with art by Ponkan 8 (Shirobako and My Youth Romantic Comedy Is Wrong, As I Expected), Yuka Nakajima (Listen to Me, Girls. I Am Your Father!, Amagi Brilliant Park), and TakayaKi (Arifureta Shokugyou de Sekai Saikyou). A crossover with Natsume Akatsuki's light novel series KonoSuba, titled Re:Starting Life Blessing This World, was published on December 21, 2016. The book featured interviews with each series' authors and illustrators, as well as the principal voice actors in their respective anime adaptations. A one-shot crossover manga by Daichi Matsuse and Masahito Watari (illustrator of the KonoSuba manga adaptation) was also included. A fanbook containing commentary on the episodes of the anime, as well as the collected Animate Times cast and staff interviews, was published on December 31, 2016. Bushiroad released a Booster Pack set and Trial Deck+ of Re:ZERO -Starting Life in Another World- for Weiß Schwarz on December 28, 2018.
Reception
According to Japanese light novel news website LN News, the series had 1 million copies in print by June 2016; over 2 million by September of the same year; and over 3.1 million by May 2017. It had over 11 million copies in circulation by January 2022. The overall series (light novel and manga adaptation volumes) had over 13 million copies in circulation by March 2023 (including digital versions). The light novel series was the tenth best-selling light novel series in Japan between November 2015 and May 2016, selling 263,357 copies. During that period, the first and second volumes were the 35th and 48th best-selling light novel volumes, selling 49,194 and 41,617 copies, respectively. The series was the fourth best-selling series in 2016, selling 1,007,381 copies between November 2015 and November 2016. Its first three volumes were the fourteenth, 21st, and 30th best-selling volumes of the year, selling 155,363, 127,970, and 110,574 copies, respectively. In 2017, the series was the third best-selling series, with 925,671 copies sold. Its first, tenth, eleventh, and twelfth volumes respectively ranked nineteenth (60,135 copies), 25th (56,001 copies), seventh (101,480 copies), and twelfth (79,431 copies) in the period between November 2016 and May 2017. In 2019, the series sold 550,202 copies.
The series was the 21st best-selling anime series on home video during 2016, selling approximately 68,791 Blu-ray and DVD sets. The OVA, "Memory Snow," released in 2018, sold a total of 10,429 Blu-ray and DVD copies.
Theron Martin of Anime News Network reviewed the first book, praising it for being a somewhat fresher take on the "transported to another world" concept, but leveled criticism at it for bumpy and awkwardly timed dialogue and a tendency for redundancy.
The series ranked first in a poll of 820 people conducted by the Japanese website Anime! Anime!, to determine the best show of spring 2016. Andy Hanley from UK Anime Network considered the anime adaptation as one of 2016's best series.
Richard Eisenbeis, managing editor of Anime Now!, listed the anime as one of his top picks of 2016 for its "culturally complex" world and characters that have "their own plans, faults, and motivations." He praised Subaru as the "most complex character of the year" for provoking the audience to "cheer him and despise him" in a world that portrayed him as the "least special person in it."
The series took second place in the 2015–2016 Newtype Anime Awards. Additionally, director Masaharu Watanabe took first place, as did Subaru, Rem, and Puck (in the best male, female, and mascot character categories, respectively). Masahiro Yokotani's screenplay took second place, while the series' character designs (by Shin'ichirō Ōtsuka and Kyuta Sakai) took third place. The series' soundtrack and second opening theme both took fourth place in their categories. The light novels and the anime both took first place in their respective categories in the 2017 Sugoi Japan Awards.
In a survey of (primarily female) Otamart users, the series was ranked second on a list of the most successful anime/manga/light novel franchises of 2016. Re:Zero was nominated for "Anime of the Year" at Crunchyroll's inaugural Anime Awards in 2016, and was also the service's most-watched series of 2016, topping Yuri!!! on Ice.
References
Notes
Citations
External links
at Shōsetsuka ni Narō
at Big Gangan
at 5pb.
2014 Japanese novels
2016 anime ONAs
2017 video games
2018 anime OVAs
Adventure anime and manga
Anime and manga based on light novels
Comics about time travel
Crunchyroll Anime Awards winners
Crunchyroll anime
Dark fantasy anime and manga
Funimation
Gangan Comics manga
Isekai anime and manga
Isekai novels and light novels
Japanese adventure novels
Japanese fantasy novels
Japanese time travel television series
Kadokawa Dwango franchises
Light novels first published online
Light novels
MF Bunko J
Mages (company)
Mass media franchises
Media Factory manga
Muse Communication
Mythopoeia
Novels about time travel
PlayStation 4 games
PlayStation Vita games
Seinen manga
Seven deadly sins in popular culture
Shōsetsuka ni Narō
Studio Puyukai
TV Tokyo original programming
Television shows based on light novels
Time loop anime and manga
Time loop novels
Upcoming anime television series
Video games based on novels
Visual novels
White Fox
Yen Press titles
Yōkai in anime and manga
|
```python
# Qt Components
from .qt import QtCore, QtGui, QtWidgets
from .common import KeyboardShortcuts
class AppEventFilter(QtCore.QObject):
'''This class's primary responsibility is delivering key events to
"the right place". Given usdview's simplistic approach to shortcuts
(i.e. just uses the native Qt mechanism that does not allow for
context-sensitive keypress dispatching), we take a simplistic approach
to routing: use Qt's preferred mechanism of processing keyPresses
only in widgets that have focus; therefore, the primary behavior of this
filter is to track mouse-position in order to set widget focus, so that
widgets with keyboard navigation behaviors operate when the mouse is over
them.
We add one special behavior on top of that, which is to turn unaccepted
left/right events into up/down events for TreeView widgets, because we
do not have a specialized class on which to provide this nice navigation
behavior.'''
# in future it would be a hotkey dispatcher instead of appController
# that we'd dispatch to, but we don't have one yet
def __init__(self, appController):
QtCore.QObject.__init__(self)
self._appController = appController
def IsNavKey(self, key, modifiers):
# Note that the arrow keys are considered part of the keypad on macOS.
return (key in (QtCore.Qt.Key_Left, QtCore.Qt.Key_Right,
QtCore.Qt.Key_Up, QtCore.Qt.Key_Down,
QtCore.Qt.Key_PageUp, QtCore.Qt.Key_PageDown,
QtCore.Qt.Key_Home, QtCore.Qt.Key_End,
KeyboardShortcuts.FramingKey)
and modifiers in (QtCore.Qt.NoModifier,
QtCore.Qt.KeypadModifier))
def _IsWindow(self, obj):
if isinstance(obj, QtWidgets.QWidget):
return obj.isWindow()
else:
return isinstance(obj, QtGui.QWindow)
def TopLevelWindow(self, obj):
parent = obj.parent()
return obj if (self._IsWindow(obj) or not parent) else self.TopLevelWindow(parent)
def WantsNavKeys(self, w):
if not w or self._IsWindow(w):
return False
# The broader test would be QtWidgets.QAbstractItemView,
# but pragmatically, the TableViews in usdview don't really
# benefit much from keyboard navigation, and we'd rather
# allow the arrow keys drive the playhead when such widgets would
# otherwise get focus
elif isinstance(w, QtWidgets.QTreeView):
return True
else:
return self.WantsNavKeys(w.parent())
def NavigableOrTopLevelObject(self, w):
if (not w or
self._IsWindow(w) or
isinstance(w, QtWidgets.QTreeView) or
isinstance(w, QtWidgets.QDialog)):
return w
else:
parent = w.parent()
return w if not parent else self.NavigableOrTopLevelObject(parent)
def JealousFocus(self, w):
return (isinstance(w, QtWidgets.QLineEdit) or
isinstance(w, QtWidgets.QComboBox) or
isinstance(w, QtWidgets.QTextEdit) or
isinstance(w, QtWidgets.QPlainTextEdit) or
isinstance(w, QtWidgets.QAbstractSlider) or
isinstance(w, QtWidgets.QAbstractSpinBox) or
isinstance(w, QtWidgets.QWidget) and w.windowModality() in [QtCore.Qt.WindowModal,
QtCore.Qt.ApplicationModal])
def SetFocusFromMousePos(self, backupWidget):
# It's possible the mouse isn't over any of our windows at the time,
# in which case use the top-level window of backupWidget.
overObject = QtWidgets.QApplication.widgetAt(QtGui.QCursor.pos())
topLevelObject = self.NavigableOrTopLevelObject(overObject)
focusObject = topLevelObject if topLevelObject else self.TopLevelWindow(backupWidget)
if focusObject and isinstance(focusObject, QtWidgets.QWidget):
focusObject.setFocus()
def eventFilter(self, widget, event):
# There is currently no filtering we want to do for modal or popups
if (QtWidgets.QApplication.activeModalWidget() or
QtWidgets.QApplication.activePopupWidget()):
return False
currFocusWidget = QtWidgets.QApplication.focusWidget()
# Check for ShortcutOverride events to ensure we pick up navigation keys
# that have been set as shortcuts for QActions. We still want to
# dispatch those to the focus widget as needed.
if (event.type() == QtCore.QEvent.ShortcutOverride):
if (self.IsNavKey(event.key(), event.modifiers()) and
self.WantsNavKeys(currFocusWidget)):
event.setAccepted(True)
return True
elif (event.type() == QtCore.QEvent.KeyPress):
key = event.key()
isNavKey = self.IsNavKey(key, event.modifiers())
if key == QtCore.Qt.Key_Escape:
# ESC resets focus based on mouse position, regardless of
# who currently holds focus
self.SetFocusFromMousePos(widget)
return True
elif currFocusWidget and self.JealousFocus(currFocusWidget):
# Don't touch if there's a greedy focus widget
return False
elif (isNavKey and self.WantsNavKeys(currFocusWidget)):
# Special handling for navigation keys:
# 1. When a "navigable" widget is focussed (a TreeView),
# route arrow keys to the widget and consume them
# 2. To make for snappier navigation, when the TreeView
# won't accept a left/right because an item is already
# opened or closed, turn it into an up/down event. It
# WBN if this behavior could be part of the widgets
# themselves, but currently, usdview does not specialize
# a class for its TreeView widgets.
event.setAccepted(False)
currFocusWidget.event(event)
accepted = event.isAccepted()
if (not accepted and
key in (QtCore.Qt.Key_Left, QtCore.Qt.Key_Right)):
advance = (key == QtCore.Qt.Key_Right)
altNavKey = QtCore.Qt.Key_Down if advance else QtCore.Qt.Key_Up
subEvent = QtGui.QKeyEvent(QtCore.QEvent.KeyPress,
altNavKey,
event.modifiers())
QtWidgets.QApplication.postEvent(currFocusWidget, subEvent)
event.setAccepted(True)
return True
elif isNavKey:
if self._appController.processNavKeyEvent(event):
return True
elif (event.type() == QtCore.QEvent.MouseMove and
not self.JealousFocus(currFocusWidget)):
self.SetFocusFromMousePos(widget)
# Note we do not consume the event!
# During startup, Qt seems to queue up events on objects that may
# have disappeared by the time the eventFilter is called upon. This
# is true regardless of how late we install the eventFilter, and
# whether we process pending events before installing. So we
# silently ignore Runtime errors that occur as a result.
try:
return QtCore.QObject.eventFilter(self, widget, event)
except RuntimeError:
return True
```
|
Dmitri Grigoryevich Turutin (; born 10 April 1981) is a former Russian professional football player.
Club career
He played in the Russian Football National League for FC Baltika Kaliningrad in 2000.
References
External links
1981 births
Sportspeople from Kaliningrad
Living people
Russian men's footballers
Men's association football midfielders
FC Baltika Kaliningrad players
FC Sibir Novosibirsk players
FC Dynamo Barnaul players
FC Sakhalin Yuzhno-Sakhalinsk players
FC Sibiryak Bratsk players
|
```c++
///////////////////////////////////////////////////////////////////////////////////////////////////
///////////////////////////////////////////////////////////////////////////////////////////////////
// Created : 2005-12-21
// Updated : 2005-12-21
// File : glm/gtx/orthonormalize.inl
///////////////////////////////////////////////////////////////////////////////////////////////////
namespace glm
{
template <typename T, precision P>
GLM_FUNC_QUALIFIER detail::tmat3x3<T, P> orthonormalize
(
const detail::tmat3x3<T, P>& m
)
{
detail::tmat3x3<T, P> r = m;
r[0] = normalize(r[0]);
float d0 = dot(r[0], r[1]);
r[1] -= r[0] * d0;
r[1] = normalize(r[1]);
float d1 = dot(r[1], r[2]);
d0 = dot(r[0], r[2]);
r[2] -= r[0] * d0 + r[1] * d1;
r[2] = normalize(r[2]);
return r;
}
template <typename T, precision P>
GLM_FUNC_QUALIFIER detail::tvec3<T, P> orthonormalize
(
const detail::tvec3<T, P>& x,
const detail::tvec3<T, P>& y
)
{
return normalize(x - y * dot(y, x));
}
}//namespace glm
```
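The matrix overload above is the classical Gram-Schmidt process applied to the matrix's three column vectors: normalize the first, subtract each earlier column's projection from the later ones, and normalize as you go. A minimal Python sketch of the same steps (hypothetical helper names, not part of GLM):

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    n = math.sqrt(dot(v, v))
    return [x / n for x in v]

def orthonormalize(c0, c1, c2):
    """Gram-Schmidt on three 3-D vectors, mirroring the
    glm::orthonormalize matrix overload column by column."""
    r0 = normalize(c0)
    # Remove c1's component along r0, then normalize.
    r1 = normalize([a - b * dot(r0, c1) for a, b in zip(c1, r0)])
    # Remove c2's components along both r0 and r1.
    r2 = [a - b * dot(r0, c2) - c * dot(r1, c2)
          for a, b, c in zip(c2, r0, r1)]
    return r0, r1, normalize(r2)

r0, r1, r2 = orthonormalize([1, 1, 0], [0, 1, 1], [0, 0, 1])
# All three results are unit length and mutually orthogonal.
print(abs(dot(r0, r1)) < 1e-9 and abs(dot(r1, r2)) < 1e-9)  # → True
```

Note that the vector overload at the end of the GLM snippet, `normalize(x - y * dot(y, x))`, only yields a vector exactly orthogonal to `y` when `y` is already unit length, since the projection term omits division by `dot(y, y)`.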
|
```ruby
class MarkdownlintCli < Formula
desc "CLI for Node.js style checker and lint tool for Markdown files"
homepage "path_to_url"
url "path_to_url"
sha256 your_sha256_hash
license "MIT"
bottle do
rebuild 1
sha256 cellar: :any_skip_relocation, arm64_sonoma: your_sha256_hash
sha256 cellar: :any_skip_relocation, arm64_ventura: your_sha256_hash
sha256 cellar: :any_skip_relocation, arm64_monterey: your_sha256_hash
sha256 cellar: :any_skip_relocation, sonoma: your_sha256_hash
sha256 cellar: :any_skip_relocation, ventura: your_sha256_hash
sha256 cellar: :any_skip_relocation, monterey: your_sha256_hash
sha256 cellar: :any_skip_relocation, x86_64_linux: your_sha256_hash
end
depends_on "node"
def install
system "npm", "install", *std_npm_args
bin.install_symlink Dir["#{libexec}/bin/*"]
end
test do
(testpath/"test-bad.md").write <<~EOS
# Header 1
body
EOS
(testpath/"test-good.md").write <<~EOS
  # Header 1

  body
EOS
assert_match "MD022/blanks-around-headings Headings should be surrounded by blank lines",
shell_output("#{bin}/markdownlint #{testpath}/test-bad.md 2>&1", 1)
assert_empty shell_output("#{bin}/markdownlint #{testpath}/test-good.md")
end
end
```
|
```go
//
//
// path_to_url
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
package lambda
import "net/http"
type APIGatewayError interface {
StatusCode() int
}
type BadRequestError struct {
Message string
}
func (e BadRequestError) Error() string {
if e.Message == "" {
return "bad request"
}
return e.Message
}
func (BadRequestError) StatusCode() int { return http.StatusBadRequest }
type NotFoundError struct {
Message string
}
func (e NotFoundError) Error() string {
return e.Message
}
func (NotFoundError) StatusCode() int { return http.StatusNotFound }
type UnauthorizedError struct {
Message string
}
func (e UnauthorizedError) Error() string {
return e.Message
}
func (UnauthorizedError) StatusCode() int { return http.StatusUnauthorized }
```
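The `APIGatewayError` interface lets a generic handler translate typed errors into HTTP responses without switching over concrete types: any error that knows its own status code is mapped directly, and everything else falls back to 500. A rough Python analogue of the same design (hypothetical names, purely illustrative):

```python
import http

class APIGatewayError(Exception):
    """Base class: each subclass carries the status code it maps to."""
    status_code = http.HTTPStatus.INTERNAL_SERVER_ERROR

class BadRequestError(APIGatewayError):
    status_code = http.HTTPStatus.BAD_REQUEST
    def __str__(self):
        # Default message when none is supplied, like the Go version.
        return super().__str__() or "bad request"

class NotFoundError(APIGatewayError):
    status_code = http.HTTPStatus.NOT_FOUND

def handle(err):
    """Map an exception to a (status, body) response pair."""
    if isinstance(err, APIGatewayError):
        return int(err.status_code), str(err)
    return 500, "internal server error"

print(handle(BadRequestError()))             # → (400, 'bad request')
print(handle(NotFoundError("no such user")))  # → (404, 'no such user')
```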
|