Back in-memory caches with Guava, disk caches with H2

Instead of using Ehcache for in-memory caches, use Guava. The Guava
cache code has been more completely tested by Google in high load
production environments, and it tends to have fewer bugs. It enables
caches to be built at any time, rather than only at server startup.

By creating a Guava cache as soon as it is declared, rather than
during the LifecycleListener.start() for the CachePool, we can promise
any downstream consumer of the cache that the cache is ready to
execute requests the moment it is supplied by Guice. This fixes a
startup ordering problem in the GroupCache and the ProjectCache, where
code wants to use one of these caches during startup to resolve a
group or project by name.

Tracking the Guava backend caches with a DynamicMap makes it possible
for plugins to define their own in-memory caches using CacheModule's
cache() function to declare the cache. It allows the core server to
make the cache available to administrators over SSH with the gerrit
show-caches and gerrit flush-caches commands.

Each persistent cache is stored in a private H2 database, with a
simple one-table schema that stores each entry in a table row as a
pair of serialized objects (key and value). Database reads are gated
by a BloomFilter, to reduce the number of calls made to H2 during
cache misses. In theory less than 3% of cache misses will reach H2 and
find nothing. Stores happen on a background thread quickly after the
put is made to the cache, reducing the risk that a diff or web_session
record is lost during an ungraceful shutdown.
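The ~3% figure matches Guava's default BloomFilter false-positive probability. As an illustration (my own sketch, not code from this change), the standard Bloom filter sizing math behind that figure can be reproduced in plain Java:

```java
// Sketch (assumption): reproduces the textbook Bloom filter sizing formulas
// that Guava applies internally when a filter is created for n expected
// insertions at false-positive probability p.
public class BloomSizing {
  // Optimal number of bits per expected entry: -ln(p) / (ln 2)^2.
  public static double bitsPerEntry(double p) {
    return -Math.log(p) / (Math.log(2) * Math.log(2));
  }

  // Optimal number of hash functions: (bits per entry) * ln 2.
  public static long hashFunctions(double bitsPerEntry) {
    return Math.round(bitsPerEntry * Math.log(2));
  }

  public static void main(String[] args) {
    double p = 0.03; // ~3% of cache misses reach H2, per the commit message
    double bpe = bitsPerEntry(p);
    // Filter sized for 64,000 entries, the startup minimum described below.
    System.out.println("bits/entry=" + bpe
        + " hashes=" + hashFunctions(bpe)
        + " totalBits=" + Math.round(bpe * 64000));
  }
}
```

At 3% the filter needs roughly 7.3 bits per entry and 5 hash functions, so even the 64,000-entry startup sizing costs well under 100 KiB of memory.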

Cache databases are capped at roughly 128 MiB of stored data by running
a prune cycle each day at 1 AM local server time. Records are removed
from the database by ordering on the last access time, where last
accessed is the last time the record was moved from disk to memory.
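The 1 AM schedule can be computed with plain java.util.Calendar, as the prune code in H2CacheImpl does; this standalone sketch (my own mirror of that logic, not part of the change) shows the delay calculation:

```java
import java.util.Calendar;

public class PruneDelay {
  // Milliseconds from 'now' until 1 AM local time on the following day,
  // mirroring how H2CacheImpl.prune() reschedules itself after each run.
  public static long millisUntilNextOneAm(long now) {
    Calendar cal = Calendar.getInstance();
    cal.setTimeInMillis(now);
    cal.set(Calendar.HOUR_OF_DAY, 1);
    cal.set(Calendar.MINUTE, 0);
    cal.set(Calendar.SECOND, 0);
    cal.set(Calendar.MILLISECOND, 0);
    cal.add(Calendar.DAY_OF_MONTH, 1); // always schedule for tomorrow's 1 AM
    return cal.getTimeInMillis() - now;
  }

  public static void main(String[] args) {
    System.out.println(millisUntilNextOneAm(System.currentTimeMillis()) + " ms");
  }
}
```

Because a full day is always added, the computed delay falls between roughly one and twenty-five hours, so prunes run at most once per day.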

Change-Id: Ia82d056796b5af9bcb1f219fe06d905c9c0fbc84
Shawn O. Pearce 2012-05-24 14:28:40 -07:00
parent 34d4d1929a
commit 2e1cb2b849
76 changed files with 2432 additions and 1674 deletions


@@ -354,8 +354,8 @@ Default is unset, no disk cache.
[[cache.name.maxAge]]cache.<name>.maxAge::
+
Maximum age to keep an entry in the cache. If an entry has not
been accessed in this period of time, it is removed from the cache.
Maximum age to keep an entry in the cache. Entries are removed from
the cache and refreshed from source data every maxAge interval.
Values should use common unit suffixes to express their setting:
+
* s, sec, second, seconds
@@ -371,7 +371,7 @@ If a unit suffix is not specified, `minutes` is assumed. If 0 is
supplied, the maximum age is infinite and items are never purged
except when the cache is full.
+
Default is `90 days` for most caches, except:
Default is `0`, meaning store forever with no expire, except:
+
* `"adv_bases"`: default is `10 minutes`
* `"ldap_groups"`: default is `1 hour`
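For example, a hypothetical `gerrit.config` entry tightening the staleness bound on the LDAP group cache (the value shown is illustrative, not a recommendation):

```
[cache "ldap_groups"]
  maxAge = 30 minutes
```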
@@ -379,33 +379,42 @@ Default is `90 days` for most caches, except:
[[cache.name.memoryLimit]]cache.<name>.memoryLimit::
+
Maximum number of cache items to retain in memory. Keep in mind
this is total number of items, not bytes of heap used.
The total cost of entries to retain in memory. The cost computation
varies by the cache. For most caches where the in-memory size of each
entry is relatively the same, memoryLimit is currently defined to be
the number of entries held by the cache (each entry costs 1).
+
For caches where the size of an entry can vary significantly between
individual entries (notably `"diff"`, `"diff_intraline"`), memoryLimit
is an approximation of the total number of bytes stored by the cache.
Larger entries that represent bigger patch sets or longer source files
will consume a bigger portion of the memoryLimit. For these caches the
memoryLimit should be set to roughly the amount of RAM (in bytes) the
administrator can dedicate to the cache.
+
Default is 1024 for most caches, except:
+
* `"adv_bases"`: default is `4096`
* `"diff"`: default is `128`
* `"diff_intraline"`: default is `128`
* `"diff"`: default is `10m` (10 MiB of memory)
* `"diff_intraline"`: default is `10m` (10 MiB of memory)
* `"plugin_resources"`: default is `2m` (2 MiB of memory)
+
If set to 0 the cache is disabled. Entries are removed immediately
after being stored by the cache. This is primarily useful for testing.
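As an illustration, a site with ample heap might grant the diff cache more memory in `gerrit.config` (the `64m` value is hypothetical):

```
[cache "diff"]
  memoryLimit = 64m
```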
[[cache.name.diskLimit]]cache.<name>.diskLimit::
+
Maximum number of cache items to retain on disk, if this cache
supports storing its items to disk. Like memoryLimit, this is
total number of items, not bytes of disk used. If 0, disk storage
for this cache is disabled.
Total size in bytes of the keys and values stored on disk. Caches that
have grown bigger than this size are scanned daily at 1 AM local
server time to trim the cache. Entries are removed in least recently
accessed order until the cache fits within this limit. Caches may
grow larger than this during the day, as the size check is only
performed once every 24 hours.
+
Default is 16384.
[[cache.name.diskBuffer]]cache.<name>.diskBuffer::
Default is 128 MiB per cache.
+
Number of bytes to buffer in memory before writing less frequently
accessed cache items to disk, if this cache supports storing its
items to disk.
+
Default is 5 MiB.
+
Common unit suffixes of 'k', 'm', or 'g' are supported.
If 0, disk storage for the cache is disabled.
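A hypothetical `gerrit.config` entry raising the disk budget and the in-memory write buffer for the diff cache (values are illustrative only):

```
[cache "diff"]
  diskLimit = 256m
  diskBuffer = 10m
```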
[[cache_names]]Standard Caches
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -447,14 +456,10 @@ Each item caches the differences between two commits, at both the
directory and file levels. Gerrit uses this cache to accelerate
the display of affected file names, as well as file contents.
+
Entries in this cache are relatively large, so the memory limit
should not be set incredibly high. Administrators should try to
target cache.diff.memoryLimit to be roughly the number of changes
which their users will process in a 1 or 2 day span.
+
Keeping entries for 90 days gives sufficient time for most changes
to be submitted or abandoned before their relevant difference items
expire out.
Entries in this cache are relatively large, so memoryLimit is an
estimate in bytes of memory used. Administrators should try to target
cache.diff.memoryLimit to fit all changes users will view in a 1 or 2
day span.
cache `"diff_intraline"`::
+
@@ -462,14 +467,10 @@ Each item caches the intraline difference of one file, when compared
between two commits. Gerrit uses this cache to accelerate display of
intraline differences when viewing a file.
+
Entries in this cache are relatively large, so the memory limit
should not be set incredibly high. Administrators should try to
target cache.diff.memoryLimit to be roughly the number of changes
which their users will process in a 1 or 2 day span.
+
Keeping entries for 90 days gives sufficient time for most changes
to be submitted or abandoned before their relevant difference items
expire out.
Entries in this cache are relatively large, so memoryLimit is an
estimate in bytes of memory used. Administrators should try to target
cache.diff.memoryLimit to fit all files users will view in a 1 or 2
day span.
cache `"git_tags"`::
+
@@ -517,6 +518,12 @@ reference. Sorting the sections can be expensive when regular
expressions are used, so this cache remembers the ordering for
each branch.
cache `"plugin_resources"`::
+
Caches formatted plugin resources, such as plugin documentation that
has been converted from Markdown to HTML. The memoryLimit refers to
the bytes of memory dedicated to storing the documentation.
cache `"projects"`::
+
Caches the project description records, from the `projects` table
@@ -550,8 +557,8 @@ and need to sign-in again after the restart, as the cache was
unable to persist the session information. Enabling a disk cache
is strongly recommended.
+
Session storage is relatively inexpensive, the average entry in
this cache is approximately 248 bytes, depending on the JVM.
Session storage is relatively inexpensive. The average entry in
this cache is approximately 346 bytes.
See also link:cmd-flush-caches.html[gerrit flush-caches].
@@ -598,13 +605,6 @@ configuration.
+
Default is true, enabled.
cache.plugin_resources.memoryLimit::
+
Number of bytes of memory to use to cache formatted plugin resources,
such as plugin documentation that has been converted from Markdown to
HTML. Default is 2 MiB. Common unit suffixes of 'k', 'm', or 'g' are
supported.
cache.projects.checkFrequency::
+
How often project configuration should be checked for update from Git.


@@ -18,6 +18,7 @@ Included Components
|Google Gson | <<apache2,Apache License 2.0>>
|Google Web Toolkit | <<apache2,Apache License 2.0>>
|Guice | <<apache2,Apache License 2.0>>
|Guava Libraries | <<apache2,Apache License 2.0>>
|Apache Commons Codec | <<apache2,Apache License 2.0>>
|Apache Commons DBCP | <<apache2,Apache License 2.0>>
|Apache Commons Http Client | <<apache2,Apache License 2.0>>
@@ -33,7 +34,6 @@ Included Components
|Apache Xerces | <<apache2,Apache License 2.0>>
|OpenID4Java | <<apache2,Apache License 2.0>>
|Neko HTML | <<apache2,Apache License 2.0>>
|Ehcache | <<apache2,Apache License 2.0>>
|mime-util | <<apache2,Apache License 2.0>>
|Jetty | <<apache2,Apache License 2.0>>, or link:http://www.eclipse.org/legal/epl-v10.html[EPL]
|Prolog Cafe | <<prolog_cafe,EPL or GPL>>


@@ -14,3 +14,42 @@ Replication
Gerrit 2.5 no longer includes replication support out of the box.
Servers that rely upon `replication.config` to copy Git repository
data to other locations must also install the replication plugin.
Cache Configuration
~~~~~~~~~~~~~~~~~~~
Disk caches are now backed by individual H2 databases, rather than
Ehcache's own private format. Administrators are encouraged to clear
the `'$site_path'/cache` directory before starting the new server.
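The one-table layout of each per-cache database can be sketched from the SqlStore code included in this commit (the table is named `data` and keyed on `k`; the exact DDL below is an approximation, with H2's `OTHER` type holding Java-serialized objects, or `VARCHAR(4096)` when the key is a String):

```sql
CREATE TABLE IF NOT EXISTS data (
  k OTHER PRIMARY KEY,  -- serialized key
  v OTHER,              -- serialized value
  created TIMESTAMP,    -- when the entry was first stored
  accessed TIMESTAMP    -- last read; the daily prune drops oldest-accessed rows first
);
```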
The `cache.NAME.diskLimit` configuration variable is now expressed in
bytes of disk used. This is a change from previous versions of Gerrit,
which expressed the limit as the number of entries rather than bytes.
Bytes of disk is a more accurate way to size what is held. Admins that
set this variable must update their configurations, as the old values
are too small. For example a setting of `diskLimit = 65535` will only
store 64 KiB worth of data on disk and can no longer hold 65,000 patch
sets. It is recommended to delete the diskLimit variable (if set) and
rely on the built-in default of `128m`.
The `cache.diff.memoryLimit` and `cache.diff_intraline.memoryLimit`
configuration variables are now expressed in bytes of memory used,
rather than number of entries in the cache. This is a change from
previous versions of Gerrit and gives administrators more control over
how memory is partitioned within a server. Admins that set this variable
must update their configurations, as the old values are too small.
For example a setting of `memoryLimit = 1024` now means only 1 KiB of
data (which may not even hold 1 patch set), not 1024 patch sets. It
is recommended to set these to `10m` for 10 MiB of memory, and
increase as necessary.
The `cache.NAME.maxAge` variable now means the maximum amount of time
that can elapse between reads of the source data into the cache, no
matter how often it is being accessed. In prior versions it meant how
long an item could be held without being requested by a client before
it was discarded. The new meaning of elapsed time before consulting
the source data is more useful, as it enables a strict bound on how
stale the cached data can be. This is especially useful for slave
servers' account and permission data, or the `ldap_groups` cache, where
updates are often made to the source without telling Gerrit to reload
the cache.


@@ -1,6 +1,6 @@
/target
/.classpath
/.project
/.settings/org.eclipse.m2e.core.prefs
/.settings/org.maven.ide.eclipse.prefs
/gerrit-ehcache.iml
/.settings/org.eclipse.m2e.core.prefs
/gerrit-cache-h2.iml


@@ -1,4 +1,5 @@
#Tue May 15 09:21:09 PDT 2012
#Thu Jul 28 11:02:36 PDT 2011
eclipse.preferences.version=1
encoding//src/main/java=UTF-8
encoding//src/test/java=UTF-8
encoding/<project>=UTF-8


@@ -1,4 +1,4 @@
#Thu Jan 19 12:55:44 PST 2012
#Thu Jul 28 11:02:36 PDT 2011
eclipse.preferences.version=1
org.eclipse.jdt.core.compiler.codegen.inlineJsrBytecode=enabled
org.eclipse.jdt.core.compiler.codegen.targetPlatform=1.6


@@ -1,6 +1,6 @@
<?xml version="1.0" encoding="UTF-8"?>
<!--
Copyright (C) 2010 The Android Open Source Project
Copyright (C) 2012 The Android Open Source Project
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
@@ -25,23 +25,28 @@ limitations under the License.
<version>2.5-SNAPSHOT</version>
</parent>
<artifactId>gerrit-ehcache</artifactId>
<name>Gerrit Code Review - Ehcache Bindings</name>
<artifactId>gerrit-cache-h2</artifactId>
<name>Gerrit Code Review - Guava + H2 caching</name>
<description>
Bindings to Ehcache
Implementation of caching backed by Guava and H2
</description>
<dependencies>
<dependency>
<groupId>net.sf.ehcache</groupId>
<artifactId>ehcache-core</artifactId>
</dependency>
<dependency>
<groupId>com.google.gerrit</groupId>
<artifactId>gerrit-server</artifactId>
<version>${project.version}</version>
</dependency>
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
</dependency>
<dependency>
<groupId>com.h2database</groupId>
<artifactId>h2</artifactId>
</dependency>
</dependencies>
</project>


@@ -0,0 +1,120 @@
// Copyright (C) 2012 The Android Open Source Project
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package com.google.gerrit.server.cache.h2;
import com.google.common.base.Strings;
import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.LoadingCache;
import com.google.common.cache.Weigher;
import com.google.gerrit.lifecycle.LifecycleModule;
import com.google.gerrit.server.cache.CacheBinding;
import com.google.gerrit.server.cache.MemoryCacheFactory;
import com.google.gerrit.server.cache.PersistentCacheFactory;
import com.google.gerrit.server.cache.h2.H2CacheImpl.ValueHolder;
import com.google.gerrit.server.config.ConfigUtil;
import com.google.gerrit.server.config.GerritServerConfig;
import com.google.inject.Inject;
import org.eclipse.jgit.lib.Config;
import java.util.concurrent.TimeUnit;
public class DefaultCacheFactory implements MemoryCacheFactory {
public static class Module extends LifecycleModule {
@Override
protected void configure() {
bind(DefaultCacheFactory.class);
bind(MemoryCacheFactory.class).to(DefaultCacheFactory.class);
bind(PersistentCacheFactory.class).to(H2CacheFactory.class);
listener().to(H2CacheFactory.class);
}
}
private final Config cfg;
@Inject
public DefaultCacheFactory(@GerritServerConfig Config config) {
this.cfg = config;
}
@Override
public <K, V> Cache<K, V> build(CacheBinding<K, V> def) {
return create(def, false).build();
}
@Override
public <K, V> LoadingCache<K, V> build(
CacheBinding<K, V> def,
CacheLoader<K, V> loader) {
return create(def, false).build(loader);
}
@SuppressWarnings("unchecked")
<K, V> CacheBuilder<K, V> create(
CacheBinding<K, V> def,
boolean unwrapValueHolder) {
CacheBuilder<K,V> builder = newCacheBuilder();
builder.maximumWeight(cfg.getLong(
"cache", def.name(), "memoryLimit",
def.maximumWeight()));
Weigher<K, V> weigher = def.weigher();
if (weigher != null && unwrapValueHolder) {
final Weigher<K, V> impl = weigher;
weigher = (Weigher<K, V>) new Weigher<K, ValueHolder<V>> () {
@Override
public int weigh(K key, ValueHolder<V> value) {
return impl.weigh(key, value.value);
}
};
} else if (weigher == null) {
weigher = unitWeight();
}
builder.weigher(weigher);
Long age = def.expireAfterWrite(TimeUnit.SECONDS);
if (has(def.name(), "maxAge")) {
builder.expireAfterWrite(ConfigUtil.getTimeUnit(cfg,
"cache", def.name(), "maxAge",
age != null ? age : 0,
TimeUnit.SECONDS), TimeUnit.SECONDS);
} else if (age != null) {
builder.expireAfterWrite(age, TimeUnit.SECONDS);
}
return builder;
}
private boolean has(String name, String var) {
return !Strings.isNullOrEmpty(cfg.getString("cache", name, var));
}
@SuppressWarnings({"rawtypes", "unchecked"})
private static <K, V> CacheBuilder<K, V> newCacheBuilder() {
CacheBuilder builder = CacheBuilder.newBuilder();
return builder;
}
private static <K, V> Weigher<K, V> unitWeight() {
return new Weigher<K, V>() {
@Override
public int weigh(K key, V value) {
return 1;
}
};
}
}


@@ -0,0 +1,198 @@
// Copyright (C) 2012 The Android Open Source Project
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package com.google.gerrit.server.cache.h2;
import com.google.common.base.Preconditions;
import com.google.common.cache.Cache;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.LoadingCache;
import com.google.common.collect.Lists;
import com.google.common.util.concurrent.ThreadFactoryBuilder;
import com.google.gerrit.extensions.events.LifecycleListener;
import com.google.gerrit.server.cache.CacheBinding;
import com.google.gerrit.server.cache.PersistentCacheFactory;
import com.google.gerrit.server.cache.h2.H2CacheImpl.SqlStore;
import com.google.gerrit.server.cache.h2.H2CacheImpl.ValueHolder;
import com.google.gerrit.server.config.GerritServerConfig;
import com.google.gerrit.server.config.SitePaths;
import com.google.inject.Inject;
import com.google.inject.Singleton;
import com.google.inject.TypeLiteral;
import org.eclipse.jgit.lib.Config;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.io.File;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
@Singleton
class H2CacheFactory implements PersistentCacheFactory, LifecycleListener {
static final Logger log = LoggerFactory.getLogger(H2CacheFactory.class);
private final DefaultCacheFactory defaultFactory;
private final Config config;
private final File cacheDir;
private final List<H2CacheImpl<?, ?>> caches;
private final ExecutorService executor;
private final ScheduledExecutorService cleanup;
private volatile boolean started;
@Inject
H2CacheFactory(
DefaultCacheFactory defaultCacheFactory,
@GerritServerConfig Config cfg,
SitePaths site) {
defaultFactory = defaultCacheFactory;
config = cfg;
File loc = site.resolve(cfg.getString("cache", null, "directory"));
if (loc == null) {
cacheDir = null;
} else if (loc.exists() || loc.mkdirs()) {
if (loc.canWrite()) {
log.info("Enabling disk cache " + loc.getAbsolutePath());
cacheDir = loc;
} else {
log.warn("Can't write to disk cache: " + loc.getAbsolutePath());
cacheDir = null;
}
} else {
log.warn("Can't create disk cache: " + loc.getAbsolutePath());
cacheDir = null;
}
caches = Lists.newLinkedList();
if (cacheDir != null) {
executor = Executors.newFixedThreadPool(
1,
new ThreadFactoryBuilder()
.setNameFormat("DiskCache-Store-%d")
.build());
cleanup = Executors.newScheduledThreadPool(
1,
new ThreadFactoryBuilder()
.setNameFormat("DiskCache-Prune-%d")
.setDaemon(true)
.build());
} else {
executor = null;
cleanup = null;
}
}
@Override
public void start() {
started = true;
if (executor != null) {
for (final H2CacheImpl<?, ?> cache : caches) {
executor.execute(new Runnable() {
@Override
public void run() {
cache.start();
}
});
cleanup.schedule(new Runnable() {
@Override
public void run() {
cache.prune(cleanup);
}
}, 30, TimeUnit.SECONDS);
}
}
}
@Override
public void stop() {
if (executor != null) {
try {
cleanup.shutdownNow();
List<Runnable> pending = executor.shutdownNow();
if (executor.awaitTermination(15, TimeUnit.MINUTES)) {
if (pending != null && !pending.isEmpty()) {
log.info(String.format("Finishing %d disk cache updates", pending.size()));
for (Runnable update : pending) {
update.run();
}
}
} else {
log.info("Timeout waiting for disk cache to close");
}
} catch (InterruptedException e) {
log.warn("Interrupted waiting for disk cache to shutdown");
}
}
for (H2CacheImpl<?, ?> cache : caches) {
cache.stop();
}
}
@SuppressWarnings({"unchecked", "rawtypes", "cast"})
@Override
public <K, V> Cache<K, V> build(CacheBinding<K, V> def) {
Preconditions.checkState(!started, "cache must be built before start");
long limit = config.getLong("cache", def.name(), "diskLimit", 128 << 20);
if (cacheDir == null || limit <= 0) {
return defaultFactory.build(def);
}
SqlStore<K, V> store = newSqlStore(def.name(), def.keyType(), limit);
H2CacheImpl<K, V> cache = new H2CacheImpl<K, V>(
executor, store, def.keyType(),
(Cache<K, ValueHolder<V>>) defaultFactory.create(def, true).build());
caches.add(cache);
return cache;
}
@SuppressWarnings("unchecked")
@Override
public <K, V> LoadingCache<K, V> build(
CacheBinding<K, V> def,
CacheLoader<K, V> loader) {
Preconditions.checkState(!started, "cache must be built before start");
long limit = config.getLong("cache", def.name(), "diskLimit", 128 << 20);
if (cacheDir == null || limit <= 0) {
return defaultFactory.build(def, loader);
}
SqlStore<K, V> store = newSqlStore(def.name(), def.keyType(), limit);
Cache<K, ValueHolder<V>> mem = (Cache<K, ValueHolder<V>>)
defaultFactory.create(def, true)
.build((CacheLoader<K, V>) new H2CacheImpl.Loader<K, V>(
executor, store, loader));
H2CacheImpl<K, V> cache = new H2CacheImpl<K, V>(
executor, store, def.keyType(), mem);
caches.add(cache);
return cache;
}
private <V, K> SqlStore<K, V> newSqlStore(
String name,
TypeLiteral<K> keyType,
long maxSize) {
File db = new File(cacheDir, name).getAbsoluteFile();
String url = "jdbc:h2:" + db.toURI().toString();
return new SqlStore<K, V>(url, keyType, maxSize);
}
}


@@ -0,0 +1,709 @@
// Copyright 2012 Google Inc. All Rights Reserved.
package com.google.gerrit.server.cache.h2;
import com.google.common.cache.AbstractLoadingCache;
import com.google.common.cache.Cache;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.CacheStats;
import com.google.common.cache.LoadingCache;
import com.google.common.hash.BloomFilter;
import com.google.common.hash.Funnel;
import com.google.common.hash.Funnels;
import com.google.common.hash.PrimitiveSink;
import com.google.inject.TypeLiteral;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.OutputStream;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
import java.sql.Timestamp;
import java.util.Calendar;
import java.util.Map;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.Executor;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicLong;
/**
* Hybrid in-memory and database backed cache built on H2.
* <p>
* This cache can be used as either a recall cache, or a loading cache if a
* CacheLoader was supplied to its constructor at build time. Before creating an
* entry the in-memory cache is checked for the item, then the database is
* checked, and finally the CacheLoader is used to construct the item. This is
* mostly useful for CacheLoaders that are computationally intensive, such as
* the PatchListCache.
* <p>
* Cache stores and invalidations are performed on a background thread, hiding
* the latency associated with serializing the key and value pairs and writing
* them to the database log.
* <p>
* A BloomFilter is used around the database to reduce the number of SELECTs
* issued against the database for new cache items that have not been seen
* before, a common operation for the PatchListCache. The BloomFilter is sized
* when the cache starts to be 64,000 entries or double the number of items
* currently in the database table.
* <p>
* This cache does not export its items as a ConcurrentMap.
*
* @see H2CacheFactory
*/
public class H2CacheImpl<K, V> extends AbstractLoadingCache<K, V> {
private static final Logger log = LoggerFactory.getLogger(H2CacheImpl.class);
private final Executor executor;
private final SqlStore<K, V> store;
private final TypeLiteral<K> keyType;
private final Cache<K, ValueHolder<V>> mem;
H2CacheImpl(Executor executor,
SqlStore<K, V> store,
TypeLiteral<K> keyType,
Cache<K, ValueHolder<V>> mem) {
this.executor = executor;
this.store = store;
this.keyType = keyType;
this.mem = mem;
}
@Override
public V getIfPresent(Object objKey) {
if (!keyType.getRawType().isInstance(objKey)) {
return null;
}
@SuppressWarnings("unchecked")
K key = (K) objKey;
ValueHolder<V> h = mem.getIfPresent(key);
if (h != null) {
return h.value;
}
if (store.mightContain(key)) {
h = store.getIfPresent(key);
if (h != null) {
mem.put(key, h);
return h.value;
}
}
return null;
}
@Override
public V get(K key) throws ExecutionException {
if (mem instanceof LoadingCache) {
return ((LoadingCache<K, ValueHolder<V>>) mem).get(key).value;
}
throw new UnsupportedOperationException();
}
@Override
public void put(final K key, V val) {
final ValueHolder<V> h = new ValueHolder<V>(val);
h.created = System.currentTimeMillis();
mem.put(key, h);
executor.execute(new Runnable() {
@Override
public void run() {
store.put(key, h);
}
});
}
@SuppressWarnings("unchecked")
@Override
public void invalidate(final Object key) {
if (keyType.getRawType().isInstance(key) && store.mightContain((K) key)) {
executor.execute(new Runnable() {
@Override
public void run() {
store.invalidate((K) key);
}
});
}
mem.invalidate(key);
}
@Override
public void invalidateAll() {
store.invalidateAll();
mem.invalidateAll();
}
@Override
public long size() {
return mem.size();
}
@Override
public CacheStats stats() {
return mem.stats();
}
public DiskStats diskStats() {
return store.diskStats();
}
void start() {
store.open();
}
void stop() {
for (Map.Entry<K, ValueHolder<V>> e : mem.asMap().entrySet()) {
ValueHolder<V> h = e.getValue();
if (!h.clean) {
store.put(e.getKey(), h);
}
}
store.close();
}
void prune(final ScheduledExecutorService service) {
store.prune(mem);
Calendar cal = Calendar.getInstance();
cal.set(Calendar.HOUR_OF_DAY, 01);
cal.set(Calendar.MINUTE, 0);
cal.set(Calendar.SECOND, 0);
cal.set(Calendar.MILLISECOND, 0);
cal.add(Calendar.DAY_OF_MONTH, 1);
long delay = cal.getTimeInMillis() - System.currentTimeMillis();
service.schedule(new Runnable() {
@Override
public void run() {
prune(service);
}
}, delay, TimeUnit.MILLISECONDS);
}
public static class DiskStats {
long size;
long space;
long hitCount;
long missCount;
public long size() {
return size;
}
public long space() {
return space;
}
public long hitCount() {
return hitCount;
}
public long requestCount() {
return hitCount + missCount;
}
}
static class ValueHolder<V> {
final V value;
long created;
volatile boolean clean;
ValueHolder(V value) {
this.value = value;
}
}
static class Loader<K, V> extends CacheLoader<K, ValueHolder<V>> {
private final Executor executor;
private final SqlStore<K, V> store;
private final CacheLoader<K, V> loader;
Loader(Executor executor, SqlStore<K, V> store, CacheLoader<K, V> loader) {
this.executor = executor;
this.store = store;
this.loader = loader;
}
@Override
public ValueHolder<V> load(final K key) throws Exception {
if (store.mightContain(key)) {
ValueHolder<V> h = store.getIfPresent(key);
if (h != null) {
return h;
}
}
final ValueHolder<V> h = new ValueHolder<V>(loader.load(key));
h.created = System.currentTimeMillis();
executor.execute(new Runnable() {
@Override
public void run() {
store.put(key, h);
}
});
return h;
}
}
private static class KeyType<K> {
String columnType() {
return "OTHER";
}
@SuppressWarnings("unchecked")
K get(ResultSet rs, int col) throws SQLException {
return (K) rs.getObject(col);
}
void set(PreparedStatement ps, int col, K value) throws SQLException {
ps.setObject(col, value);
}
Funnel<K> funnel() {
return new Funnel<K>() {
@Override
public void funnel(K from, PrimitiveSink into) {
try {
ObjectOutputStream ser =
new ObjectOutputStream(new SinkOutputStream(into));
ser.writeObject(from);
ser.flush();
} catch (IOException err) {
throw new RuntimeException("Cannot hash as Serializable", err);
}
}
};
}
@SuppressWarnings("unchecked")
static <K> KeyType<K> create(TypeLiteral<K> type) {
if (type.getRawType() == String.class) {
return (KeyType<K>) STRING;
}
return (KeyType<K>) OTHER;
}
static final KeyType<?> OTHER = new KeyType<Object>();
static final KeyType<String> STRING = new KeyType<String>() {
@Override
String columnType() {
return "VARCHAR(4096)";
}
@Override
String get(ResultSet rs, int col) throws SQLException {
return rs.getString(col);
}
@Override
void set(PreparedStatement ps, int col, String value)
throws SQLException {
ps.setString(col, value);
}
@SuppressWarnings("unchecked")
@Override
Funnel<String> funnel() {
Funnel<?> s = Funnels.stringFunnel();
return (Funnel<String>) s;
}
};
}
static class SqlStore<K, V> {
private final String url;
private final KeyType<K> keyType;
private final long maxSize;
private final BlockingQueue<SqlHandle> handles;
private final AtomicLong hitCount = new AtomicLong();
private final AtomicLong missCount = new AtomicLong();
private volatile BloomFilter<K> bloomFilter;
private int estimatedSize;
SqlStore(String jdbcUrl, TypeLiteral<K> keyType, long maxSize) {
this.url = jdbcUrl;
this.keyType = KeyType.create(keyType);
this.maxSize = maxSize;
int cores = Runtime.getRuntime().availableProcessors();
int keep = Math.min(cores, 16);
this.handles = new ArrayBlockingQueue<SqlHandle>(keep);
}
synchronized void open() {
if (bloomFilter == null) {
bloomFilter = buildBloomFilter();
}
}
void close() {
SqlHandle h;
while ((h = handles.poll()) != null) {
h.close();
}
}
boolean mightContain(K key) {
BloomFilter<K> b = bloomFilter;
if (b == null) {
synchronized (this) {
b = bloomFilter;
if (b == null) {
b = buildBloomFilter();
bloomFilter = b;
}
}
}
return b == null || b.mightContain(key);
}
private BloomFilter<K> buildBloomFilter() {
SqlHandle c = null;
try {
c = acquire();
Statement s = c.conn.createStatement();
try {
ResultSet r;
if (estimatedSize <= 0) {
r = s.executeQuery("SELECT COUNT(*) FROM data");
try {
estimatedSize = r.next() ? r.getInt(1) : 0;
} finally {
r.close();
}
}
BloomFilter<K> b = newBloomFilter();
r = s.executeQuery("SELECT k FROM data");
try {
while (r.next()) {
b.put(keyType.get(r, 1));
}
} finally {
r.close();
}
return b;
} finally {
s.close();
}
} catch (SQLException e) {
log.warn("Cannot build BloomFilter for " + url, e);
c = close(c);
return null;
} finally {
release(c);
}
}
ValueHolder<V> getIfPresent(K key) {
SqlHandle c = null;
try {
c = acquire();
if (c.get == null) {
c.get = c.conn.prepareStatement("SELECT v FROM data WHERE k=?");
}
keyType.set(c.get, 1, key);
ResultSet r = c.get.executeQuery();
try {
if (!r.next()) {
missCount.incrementAndGet();
return null;
}
@SuppressWarnings("unchecked")
V val = (V) r.getObject(1);
ValueHolder<V> h = new ValueHolder<V>(val);
h.clean = true;
hitCount.incrementAndGet();
touch(c, key);
return h;
} finally {
r.close();
c.get.clearParameters();
}
} catch (SQLException e) {
log.warn("Cannot read cache " + url + " for " + key, e);
c = close(c);
return null;
} finally {
release(c);
}
}
private void touch(SqlHandle c, K key) throws SQLException {
if (c.touch == null) {
c.touch = c.conn.prepareStatement("UPDATE data SET accessed=? WHERE k=?");
}
try {
c.touch.setTimestamp(1, new Timestamp(System.currentTimeMillis()));
keyType.set(c.touch, 2, key);
c.touch.executeUpdate();
} finally {
c.touch.clearParameters();
}
}
void put(K key, ValueHolder<V> holder) {
if (holder.clean) {
return;
}
BloomFilter<K> b = bloomFilter;
if (b != null) {
b.put(key);
bloomFilter = b; // volatile re-write publishes the filter mutation to readers
}
SqlHandle c = null;
try {
c = acquire();
if (c.put == null) {
c.put = c.conn.prepareStatement("MERGE INTO data VALUES(?,?,?,?)");
}
try {
keyType.set(c.put, 1, key);
c.put.setObject(2, holder.value);
c.put.setTimestamp(3, new Timestamp(holder.created));
c.put.setTimestamp(4, new Timestamp(System.currentTimeMillis()));
c.put.executeUpdate();
holder.clean = true;
} finally {
c.put.clearParameters();
}
} catch (SQLException e) {
log.warn("Cannot put into cache " + url, e);
c = close(c);
} finally {
release(c);
}
}
void invalidate(K key) {
SqlHandle c = null;
try {
c = acquire();
invalidate(c, key);
} catch (SQLException e) {
log.warn("Cannot invalidate cache " + url, e);
c = close(c);
} finally {
release(c);
}
}
private void invalidate(SqlHandle c, K key) throws SQLException {
if (c.invalidate == null) {
c.invalidate = c.conn.prepareStatement("DELETE FROM data WHERE k=?");
}
try {
keyType.set(c.invalidate, 1, key);
c.invalidate.executeUpdate();
} finally {
c.invalidate.clearParameters();
}
}
void invalidateAll() {
SqlHandle c = null;
try {
c = acquire();
Statement s = c.conn.createStatement();
try {
s.executeUpdate("DELETE FROM data");
} finally {
s.close();
}
bloomFilter = newBloomFilter();
} catch (SQLException e) {
log.warn("Cannot invalidate cache " + url, e);
c = close(c);
} finally {
release(c);
}
}
void prune(Cache<K, ?> mem) {
SqlHandle c = null;
try {
c = acquire();
Statement s = c.conn.createStatement();
try {
long used = 0;
ResultSet r = s.executeQuery("SELECT"
+ " SUM(OCTET_LENGTH(k) + OCTET_LENGTH(v))"
+ " FROM data");
try {
used = r.next() ? r.getLong(1) : 0;
} finally {
r.close();
}
if (used <= maxSize) {
return;
}
r = s.executeQuery("SELECT"
+ " k"
+ ",OCTET_LENGTH(k) + OCTET_LENGTH(v)"
+ " FROM data"
+ " ORDER BY accessed");
try {
while (maxSize < used && r.next()) {
K key = keyType.get(r, 1);
if (mem.getIfPresent(key) != null) {
touch(c, key);
} else {
invalidate(c, key);
used -= r.getLong(2);
}
}
} finally {
r.close();
}
} finally {
s.close();
}
} catch (SQLException e) {
log.warn("Cannot prune cache " + url, e);
c = close(c);
} finally {
release(c);
}
}
DiskStats diskStats() {
DiskStats d = new DiskStats();
d.hitCount = hitCount.get();
d.missCount = missCount.get();
SqlHandle c = null;
try {
c = acquire();
Statement s = c.conn.createStatement();
try {
ResultSet r = s.executeQuery("SELECT"
+ " COUNT(*)"
+ ",SUM(OCTET_LENGTH(k) + OCTET_LENGTH(v))"
+ " FROM data");
try {
if (r.next()) {
d.size = r.getLong(1);
d.space = r.getLong(2);
}
} finally {
r.close();
}
} finally {
s.close();
}
} catch (SQLException e) {
log.warn("Cannot get DiskStats for " + url, e);
c = close(c);
} finally {
release(c);
}
return d;
}
private SqlHandle acquire() throws SQLException {
SqlHandle h = handles.poll();
return h != null ? h : new SqlHandle(url, keyType);
}
private void release(SqlHandle h) {
if (h != null && !handles.offer(h)) {
h.close();
}
}
private SqlHandle close(SqlHandle h) {
if (h != null) {
h.close();
}
return null;
}
private BloomFilter<K> newBloomFilter() {
int cnt = Math.max(64 * 1024, 2 * estimatedSize);
return BloomFilter.create(keyType.funnel(), cnt);
}
}
static class SqlHandle {
private final String url;
Connection conn;
PreparedStatement get;
PreparedStatement put;
PreparedStatement touch;
PreparedStatement invalidate;
SqlHandle(String url, KeyType<?> type) throws SQLException {
this.url = url;
this.conn = org.h2.Driver.load().connect(url, null);
Statement stmt = conn.createStatement();
try {
stmt.execute("CREATE TABLE IF NOT EXISTS data"
+ "(k " + type.columnType() + " NOT NULL PRIMARY KEY HASH"
+ ",v OTHER NOT NULL"
+ ",created TIMESTAMP NOT NULL"
+ ",accessed TIMESTAMP NOT NULL"
+ ")");
} finally {
stmt.close();
}
}
void close() {
get = closeStatement(get);
put = closeStatement(put);
touch = closeStatement(touch);
invalidate = closeStatement(invalidate);
if (conn != null) {
try {
conn.close();
} catch (SQLException e) {
log.warn("Cannot close connection to " + url, e);
} finally {
conn = null;
}
}
}
private PreparedStatement closeStatement(PreparedStatement ps) {
if (ps != null) {
try {
ps.close();
} catch (SQLException e) {
log.warn("Cannot close statement for " + url, e);
}
}
return null;
}
}
private static class SinkOutputStream extends OutputStream {
private final PrimitiveSink sink;
SinkOutputStream(PrimitiveSink sink) {
this.sink = sink;
}
@Override
public void write(int b) {
sink.putByte((byte)b);
}
@Override
public void write(byte[] b, int p, int n) {
sink.putBytes(b, p, n);
}
}
}
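The read path above consults a BloomFilter before touching H2, so most cache misses never open a statement. A minimal sketch of that gating idea, using a toy hand-rolled filter over `java.util.BitSet` and a `HashMap` as stand-ins for Guava's `BloomFilter` and the H2 table (all names here are illustrative, not Gerrit's):

```java
import java.util.BitSet;
import java.util.HashMap;
import java.util.Map;

// Toy Bloom filter: never a false negative, occasionally a false positive.
class TinyBloom {
  private static final int SIZE = 1 << 16;
  private final BitSet bits = new BitSet(SIZE);

  private int h1(String k) { return (k.hashCode() & 0x7fffffff) % SIZE; }
  private int h2(String k) { return ((k.hashCode() * 31 + 17) & 0x7fffffff) % SIZE; }

  void put(String k) { bits.set(h1(k)); bits.set(h2(k)); }
  boolean mightContain(String k) { return bits.get(h1(k)) && bits.get(h2(k)); }
}

// Gate the expensive disk read behind the filter, as SqlStore.mightContain does.
class GatedStore {
  final TinyBloom bloom = new TinyBloom();
  final Map<String, String> disk = new HashMap<>(); // stand-in for the H2 table
  int diskReads = 0; // counts how often we actually "hit H2"

  void put(String k, String v) { bloom.put(k); disk.put(k, v); }

  String getIfPresent(String k) {
    if (!bloom.mightContain(k)) {
      return null; // definite miss: skip the database entirely
    }
    diskReads++;
    return disk.get(k);
  }
}
```

With a low false-positive rate, nearly all misses short-circuit before `diskReads` is incremented, which is the property the commit relies on for cheap cache misses.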


@ -1,272 +0,0 @@
// Copyright (C) 2009 The Android Open Source Project
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package com.google.gerrit.ehcache;
import static java.util.concurrent.TimeUnit.MINUTES;
import static java.util.concurrent.TimeUnit.SECONDS;
import com.google.gerrit.extensions.events.LifecycleListener;
import com.google.gerrit.lifecycle.LifecycleModule;
import com.google.gerrit.server.cache.CacheModule;
import com.google.gerrit.server.cache.CachePool;
import com.google.gerrit.server.cache.CacheProvider;
import com.google.gerrit.server.cache.EntryCreator;
import com.google.gerrit.server.cache.EvictionPolicy;
import com.google.gerrit.server.cache.ProxyCache;
import com.google.gerrit.server.config.ConfigUtil;
import com.google.gerrit.server.config.GerritServerConfig;
import com.google.gerrit.server.config.SitePaths;
import com.google.inject.Inject;
import com.google.inject.Singleton;
import net.sf.ehcache.CacheManager;
import net.sf.ehcache.Ehcache;
import net.sf.ehcache.config.CacheConfiguration;
import net.sf.ehcache.config.Configuration;
import net.sf.ehcache.config.DiskStoreConfiguration;
import net.sf.ehcache.store.MemoryStoreEvictionPolicy;
import org.eclipse.jgit.lib.Config;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.io.File;
import java.util.HashMap;
import java.util.Map;
/** Pool of all declared caches created by {@link CacheModule}s. */
@Singleton
public class EhcachePoolImpl implements CachePool {
private static final Logger log =
LoggerFactory.getLogger(EhcachePoolImpl.class);
public static class Module extends LifecycleModule {
@Override
protected void configure() {
bind(CachePool.class).to(EhcachePoolImpl.class);
bind(EhcachePoolImpl.class);
listener().to(EhcachePoolImpl.Lifecycle.class);
}
}
public static class Lifecycle implements LifecycleListener {
private final EhcachePoolImpl cachePool;
@Inject
Lifecycle(final EhcachePoolImpl cachePool) {
this.cachePool = cachePool;
}
@Override
public void start() {
cachePool.start();
}
@Override
public void stop() {
cachePool.stop();
}
}
private final Config config;
private final SitePaths site;
private final Object lock = new Object();
private final Map<String, CacheProvider<?, ?>> caches;
private CacheManager manager;
@Inject
EhcachePoolImpl(@GerritServerConfig final Config cfg, final SitePaths site) {
this.config = cfg;
this.site = site;
this.caches = new HashMap<String, CacheProvider<?, ?>>();
}
@SuppressWarnings({"rawtypes", "unchecked"})
private void start() {
synchronized (lock) {
if (manager != null) {
throw new IllegalStateException("Cache pool has already been started");
}
try {
System.setProperty("net.sf.ehcache.skipUpdateCheck", "" + true);
} catch (SecurityException e) {
// Ignore it, the system is just going to ping some external page
// using a background thread and there's not much we can do about
// it now.
}
manager = new CacheManager(new Factory().toConfiguration());
for (CacheProvider<?, ?> p : caches.values()) {
Ehcache eh = manager.getEhcache(p.getName());
EntryCreator<?, ?> c = p.getEntryCreator();
if (c != null) {
p.bind(new PopulatingCache(eh, c));
} else {
p.bind(new SimpleCache(eh));
}
}
}
}
private void stop() {
synchronized (lock) {
if (manager != null) {
manager.shutdown();
}
}
}
/** <i>Discouraged</i> Get the underlying cache descriptions, for statistics. */
public CacheManager getCacheManager() {
synchronized (lock) {
return manager;
}
}
public <K, V> ProxyCache<K, V> register(final CacheProvider<K, V> provider) {
synchronized (lock) {
if (manager != null) {
throw new IllegalStateException("Cache pool has already been started");
}
final String n = provider.getName();
if (caches.containsKey(n) && caches.get(n) != provider) {
throw new IllegalStateException("Cache \"" + n + "\" already defined");
}
caches.put(n, provider);
return new ProxyCache<K, V>();
}
}
private class Factory {
private static final int MB = 1024 * 1024;
private final Configuration mgr = new Configuration();
Configuration toConfiguration() {
configureDiskStore();
configureDefaultCache();
for (CacheProvider<?, ?> p : caches.values()) {
final String name = p.getName();
final CacheConfiguration c = newCache(name);
c.setMemoryStoreEvictionPolicyFromObject(toPolicy(p.evictionPolicy()));
c.setMaxElementsInMemory(getInt(name, "memorylimit", p.memoryLimit()));
c.setTimeToIdleSeconds(0);
c.setTimeToLiveSeconds(getSeconds(name, "maxage", p.maxAge()));
c.setEternal(c.getTimeToLiveSeconds() == 0);
if (p.disk() && mgr.getDiskStoreConfiguration() != null) {
c.setMaxElementsOnDisk(getInt(name, "disklimit", p.diskLimit()));
int v = c.getDiskSpoolBufferSizeMB() * MB;
v = getInt(name, "diskbuffer", v) / MB;
c.setDiskSpoolBufferSizeMB(Math.max(1, v));
c.setOverflowToDisk(c.getMaxElementsOnDisk() > 0);
c.setDiskPersistent(c.getMaxElementsOnDisk() > 0);
}
mgr.addCache(c);
}
return mgr;
}
private MemoryStoreEvictionPolicy toPolicy(final EvictionPolicy policy) {
switch (policy) {
case LFU:
return MemoryStoreEvictionPolicy.LFU;
case LRU:
return MemoryStoreEvictionPolicy.LRU;
default:
throw new IllegalArgumentException("Unsupported " + policy);
}
}
private int getInt(String n, String s, int d) {
return config.getInt("cache", n, s, d);
}
private long getSeconds(String n, String s, long d) {
d = MINUTES.convert(d, SECONDS);
long m = ConfigUtil.getTimeUnit(config, "cache", n, s, d, MINUTES);
return SECONDS.convert(m, MINUTES);
}
private void configureDiskStore() {
boolean needDisk = false;
for (CacheProvider<?, ?> p : caches.values()) {
if (p.disk()) {
needDisk = true;
break;
}
}
if (!needDisk) {
return;
}
File loc = site.resolve(config.getString("cache", null, "directory"));
if (loc == null) {
// cache.directory is not configured; leave the disk store disabled.
} else if (loc.exists() || loc.mkdirs()) {
if (loc.canWrite()) {
final DiskStoreConfiguration c = new DiskStoreConfiguration();
c.setPath(loc.getAbsolutePath());
mgr.addDiskStore(c);
log.info("Enabling disk cache " + loc.getAbsolutePath());
} else {
log.warn("Can't write to disk cache: " + loc.getAbsolutePath());
}
} else {
log.warn("Can't create disk cache: " + loc.getAbsolutePath());
}
}
private CacheConfiguration newConfiguration() {
CacheConfiguration c = new CacheConfiguration();
c.setMaxElementsInMemory(1024);
c.setMemoryStoreEvictionPolicyFromObject(MemoryStoreEvictionPolicy.LFU);
c.setTimeToIdleSeconds(0);
c.setTimeToLiveSeconds(0 /* infinite */);
c.setEternal(true);
if (mgr.getDiskStoreConfiguration() != null) {
c.setMaxElementsOnDisk(16384);
c.setOverflowToDisk(false);
c.setDiskPersistent(false);
c.setDiskSpoolBufferSizeMB(5);
c.setDiskExpiryThreadIntervalSeconds(60 * 60);
}
return c;
}
private void configureDefaultCache() {
mgr.setDefaultCacheConfiguration(newConfiguration());
}
private CacheConfiguration newCache(final String name) {
CacheConfiguration c = newConfiguration();
c.setName(name);
return c;
}
}
}


@ -1,114 +0,0 @@
// Copyright (C) 2008 The Android Open Source Project
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package com.google.gerrit.ehcache;
import com.google.gerrit.server.cache.Cache;
import com.google.gerrit.server.cache.EntryCreator;
import net.sf.ehcache.CacheException;
import net.sf.ehcache.Ehcache;
import net.sf.ehcache.Element;
import net.sf.ehcache.constructs.blocking.CacheEntryFactory;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
/**
* A decorator for {@link Cache} which automatically constructs missing entries.
* <p>
* On a cache miss {@link EntryCreator#createEntry(Object)} is invoked, allowing
* the application specific subclass to compute the entry and return it for
* caching. During a miss the cache takes a lock related to the missing key,
* ensuring that at most one thread performs the creation work, and other
* threads wait for the result. Concurrent creations are possible if two
* different keys miss and hash to different locks in the internal lock table.
*
* @param <K> type of key used to name cache entries.
* @param <V> type of value stored within a cache entry.
*/
class PopulatingCache<K, V> implements Cache<K, V> {
private static final Logger log =
LoggerFactory.getLogger(PopulatingCache.class);
private final net.sf.ehcache.constructs.blocking.SelfPopulatingCache self;
private final EntryCreator<K, V> creator;
PopulatingCache(Ehcache s, EntryCreator<K, V> entryCreator) {
creator = entryCreator;
final CacheEntryFactory f = new CacheEntryFactory() {
@SuppressWarnings("unchecked")
@Override
public Object createEntry(Object key) throws Exception {
return creator.createEntry((K) key);
}
};
self = new net.sf.ehcache.constructs.blocking.SelfPopulatingCache(s, f);
}
/**
* Get the element from the cache, or {@link EntryCreator#missing(Object)} if not found.
* <p>
* The {@link EntryCreator#missing(Object)} method is only invoked if:
* <ul>
* <li>{@code key == null}, in which case the application should return a
* suitable return value that callers can accept, or throw a RuntimeException.
* <li>{@code createEntry(key)} threw an exception, in which case the entry
* was not stored in the cache. An entry was recorded in the application log,
* but a return value is still required.
* <li>The cache has been shutdown, and access is forbidden.
* </ul>
*
* @param key key to locate.
* @return either the cached entry, or {@code missing(key)} if not found.
*/
@SuppressWarnings("unchecked")
public V get(final K key) {
if (key == null) {
return creator.missing(key);
}
final Element m;
try {
m = self.get(key);
} catch (IllegalStateException err) {
log.error("Cannot lookup " + key + " in \"" + self.getName() + "\"", err);
return creator.missing(key);
} catch (CacheException err) {
log.error("Cannot lookup " + key + " in \"" + self.getName() + "\"", err);
return creator.missing(key);
}
return m != null ? (V) m.getObjectValue() : creator.missing(key);
}
public void remove(final K key) {
if (key != null) {
self.remove(key);
}
}
/** Remove all cached items, forcing them to be created again on demand. */
public void removeAll() {
self.removeAll();
}
public void put(K key, V value) {
self.put(new Element(key, value));
}
@Override
public String toString() {
return "Cache[" + self.getName() + "]";
}
}
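The per-key locking described in the Javadoc above, where one thread computes a missing entry while others wait for the result, is the guarantee Guava's `LoadingCache` now provides. The same at-most-one-load-per-key behavior can be sketched with plain JDK `ConcurrentHashMap.computeIfAbsent` (a simplified analogue for illustration, not the Gerrit implementation):

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

class SelfPopulating<K, V> {
  private final ConcurrentHashMap<K, V> map = new ConcurrentHashMap<>();
  private final Function<K, V> loader;

  SelfPopulating(Function<K, V> loader) {
    this.loader = loader;
  }

  // computeIfAbsent locks the key's bin, so concurrent callers of the same
  // missing key block until a single loader invocation completes.
  V get(K key) {
    return map.computeIfAbsent(key, loader);
  }
}
```

Repeated lookups of the same key invoke the loader exactly once; subsequent calls return the cached value.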


@ -1,81 +0,0 @@
// Copyright (C) 2009 The Android Open Source Project
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package com.google.gerrit.ehcache;
import com.google.gerrit.server.cache.Cache;
import net.sf.ehcache.CacheException;
import net.sf.ehcache.Ehcache;
import net.sf.ehcache.Element;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
/**
* A fast in-memory and/or on-disk based cache.
*
* @param <K> type of key used to lookup entries in the cache.
* @param <V> type of value stored within each cache entry.
*/
final class SimpleCache<K, V> implements Cache<K, V> {
private static final Logger log = LoggerFactory.getLogger(SimpleCache.class);
private final Ehcache self;
SimpleCache(final Ehcache self) {
this.self = self;
}
Ehcache getEhcache() {
return self;
}
@SuppressWarnings("unchecked")
public V get(final K key) {
if (key == null) {
return null;
}
final Element m;
try {
m = self.get(key);
} catch (IllegalStateException err) {
log.error("Cannot lookup " + key + " in \"" + self.getName() + "\"", err);
return null;
} catch (CacheException err) {
log.error("Cannot lookup " + key + " in \"" + self.getName() + "\"", err);
return null;
}
return m != null ? (V) m.getObjectValue() : null;
}
public void put(final K key, final V value) {
self.put(new Element(key, value));
}
public void remove(final K key) {
if (key != null) {
self.remove(key);
}
}
public void removeAll() {
self.removeAll();
}
@Override
public String toString() {
return "Cache[" + self.getName() + "]";
}
}


@ -26,14 +26,11 @@ import com.google.gerrit.server.AnonymousUser;
import com.google.gerrit.server.CurrentUser;
import com.google.gerrit.server.IdentifiedUser;
import com.google.gerrit.server.account.AuthResult;
import com.google.gerrit.server.cache.Cache;
import com.google.gerrit.server.cache.CacheModule;
import com.google.gerrit.server.cache.EvictionPolicy;
import com.google.gerrit.server.config.AuthConfig;
import com.google.inject.Inject;
import com.google.inject.Module;
import com.google.inject.Provider;
import com.google.inject.TypeLiteral;
import com.google.inject.servlet.RequestScoped;
import javax.servlet.http.Cookie;
@ -49,13 +46,9 @@ public final class CacheBasedWebSession implements WebSession {
return new CacheModule() {
@Override
protected void configure() {
final String cacheName = WebSessionManager.CACHE_NAME;
final TypeLiteral<Cache<Key, Val>> type =
new TypeLiteral<Cache<Key, Val>>() {};
disk(type, cacheName) //
.memoryLimit(1024) // reasonable default for many sites
.maxAge(MAX_AGE_MINUTES, MINUTES) // expire sessions if they are inactive
.evictionPolicy(EvictionPolicy.LRU) // keep most recently used
persist(WebSessionManager.CACHE_NAME, String.class, Val.class)
.maximumWeight(1024) // reasonable default for many sites
.expireAfterWrite(MAX_AGE_MINUTES, MINUTES) // expire sessions if they are inactive
;
bind(WebSessionManager.class);
bind(WebSession.class)


@ -14,13 +14,13 @@
package com.google.gerrit.httpd;
import com.google.common.cache.Cache;
import com.google.gerrit.common.data.Capable;
import com.google.gerrit.reviewdb.client.Project;
import com.google.gerrit.reviewdb.server.ReviewDb;
import com.google.gerrit.server.AccessPath;
import com.google.gerrit.server.AnonymousUser;
import com.google.gerrit.server.IdentifiedUser;
import com.google.gerrit.server.cache.Cache;
import com.google.gerrit.server.cache.CacheModule;
import com.google.gerrit.server.git.AsyncReceiveCommits;
import com.google.gerrit.server.git.GitRepositoryManager;
@ -99,11 +99,11 @@ public class GitOverHttpServlet extends GitServlet {
install(new CacheModule() {
@Override
protected void configure() {
TypeLiteral<Cache<AdvertisedObjectsCacheKey, Set<ObjectId>>> cache =
new TypeLiteral<Cache<AdvertisedObjectsCacheKey, Set<ObjectId>>>() {};
core(cache, ID_CACHE)
.memoryLimit(4096)
.maxAge(10, TimeUnit.MINUTES);
cache(ID_CACHE,
AdvertisedObjectsCacheKey.class,
new TypeLiteral<Set<ObjectId>>() {})
.maximumWeight(4096)
.expireAfterWrite(10, TimeUnit.MINUTES);
}
});
}
@ -320,12 +320,12 @@ public class GitOverHttpServlet extends GitServlet {
if (isGet) {
rc.advertiseHistory();
cache.remove(cacheKey);
cache.invalidate(cacheKey);
} else {
Set<ObjectId> ids = cache.get(cacheKey);
Set<ObjectId> ids = cache.getIfPresent(cacheKey);
if (ids != null) {
rp.getAdvertisedObjects().addAll(ids);
cache.remove(cacheKey);
cache.invalidate(cacheKey);
}
}


@ -26,9 +26,9 @@ import static java.util.concurrent.TimeUnit.HOURS;
import static java.util.concurrent.TimeUnit.MILLISECONDS;
import static java.util.concurrent.TimeUnit.MINUTES;
import com.google.common.cache.Cache;
import com.google.gerrit.reviewdb.client.Account;
import com.google.gerrit.reviewdb.client.AccountExternalId;
import com.google.gerrit.server.cache.Cache;
import com.google.gerrit.server.config.ConfigUtil;
import com.google.gerrit.server.config.GerritServerConfig;
import com.google.inject.Inject;
@ -55,11 +55,11 @@ class WebSessionManager {
private final long sessionMaxAgeMillis;
private final SecureRandom prng;
private final Cache<Key, Val> self;
private final Cache<String, Val> self;
@Inject
WebSessionManager(@GerritServerConfig Config cfg,
@Named(CACHE_NAME) final Cache<Key, Val> cache) {
@Named(CACHE_NAME) final Cache<String, Val> cache) {
prng = new SecureRandom();
self = cache;
@ -76,7 +76,7 @@ class WebSessionManager {
prng.nextBytes(rnd);
buf = new ByteArrayOutputStream(3 + nonceLen);
writeVarInt32(buf, (int) Key.serialVersionUID);
writeVarInt32(buf, (int) Val.serialVersionUID);
writeVarInt32(buf, who.get());
writeBytes(buf, rnd);
@ -120,7 +120,7 @@ class WebSessionManager {
Val val = new Val(who, refreshCookieAt, remember,
lastLogin, xsrfToken, expiresAt);
self.put(key, val);
self.put(key.token, val);
return val;
}
@ -141,21 +141,19 @@ class WebSessionManager {
}
Val get(final Key key) {
Val val = self.get(key);
Val val = self.getIfPresent(key.token);
if (val != null && val.expiresAt <= now()) {
self.remove(key);
self.invalidate(key.token);
return null;
}
return val;
}
void destroy(final Key key) {
self.remove(key);
self.invalidate(key.token);
}
static final class Key implements Serializable {
static final long serialVersionUID = 2L;
static final class Key {
private transient String token;
Key(final String t) {
@ -175,18 +173,10 @@ class WebSessionManager {
public boolean equals(Object obj) {
return obj instanceof Key && token.equals(((Key) obj).token);
}
private void writeObject(final ObjectOutputStream out) throws IOException {
writeString(out, token);
}
private void readObject(final ObjectInputStream in) throws IOException {
token = readString(in);
}
}
static final class Val implements Serializable {
static final long serialVersionUID = Key.serialVersionUID;
static final long serialVersionUID = 2L;
private transient Account.Id accountId;
private transient long refreshCookieAt;


@ -14,6 +14,7 @@
package com.google.gerrit.httpd.plugins;
import com.google.gerrit.server.cache.CacheModule;
import com.google.gerrit.server.plugins.ModuleGenerator;
import com.google.gerrit.server.plugins.ReloadPluginListener;
import com.google.gerrit.server.plugins.StartPluginListener;
@ -21,6 +22,8 @@ import com.google.inject.internal.UniqueAnnotations;
import com.google.inject.servlet.ServletModule;
public class HttpPluginModule extends ServletModule {
static final String PLUGIN_RESOURCES = "plugin_resources";
@Override
protected void configureServlets() {
bind(HttpPluginServlet.class);
@ -36,5 +39,14 @@ public class HttpPluginModule extends ServletModule {
bind(ModuleGenerator.class)
.to(HttpAutoRegisterModuleGenerator.class);
install(new CacheModule() {
@Override
protected void configure() {
cache(PLUGIN_RESOURCES, ResourceKey.class, Resource.class)
.maximumWeight(2 << 20)
.weigher(ResourceWeigher.class);
}
});
}
}


@ -16,8 +16,6 @@ package com.google.gerrit.httpd.plugins;
import com.google.common.base.Strings;
import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;
import com.google.common.cache.Weigher;
import com.google.common.collect.Lists;
import com.google.common.collect.Maps;
import com.google.gerrit.extensions.registration.RegistrationHandle;
@ -32,6 +30,7 @@ import com.google.gerrit.server.ssh.SshInfo;
import com.google.inject.Inject;
import com.google.inject.Provider;
import com.google.inject.Singleton;
import com.google.inject.name.Named;
import com.google.inject.servlet.GuiceFilter;
import org.eclipse.jgit.lib.Config;
@ -57,7 +56,6 @@ import java.util.jar.JarFile;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import javax.annotation.Nullable;
import javax.servlet.FilterChain;
import javax.servlet.ServletConfig;
import javax.servlet.ServletException;
@ -90,22 +88,12 @@ class HttpPluginServlet extends HttpServlet
@Inject
HttpPluginServlet(MimeUtilFileTypeRegistry mimeUtil,
@CanonicalWebUrl Provider<String> webUrl,
@Named(HttpPluginModule.PLUGIN_RESOURCES) Cache<ResourceKey, Resource> cache,
@GerritServerConfig Config cfg,
SshInfo sshInfo) {
this.mimeUtil = mimeUtil;
this.webUrl = webUrl;
this.resourceCache = CacheBuilder.newBuilder()
.maximumWeight(cfg.getInt(
"cache", "plugin_resources", "memoryLimit",
2 * 1024 * 1024))
.weigher(new Weigher<ResourceKey, Resource>() {
@Override
public int weigh(ResourceKey key, Resource value) {
return key.weight() + value.weight();
}
})
.build();
this.resourceCache = cache;
String sshHost = "review.example.com";
int sshPort = 29418;
@ -247,8 +235,8 @@ class HttpPluginServlet extends HttpServlet
if (exists(entry)) {
sendResource(jar, entry, key, res);
} else {
resourceCache.put(key, NOT_FOUND);
NOT_FOUND.send(req, res);
resourceCache.put(key, Resource.NOT_FOUND);
Resource.NOT_FOUND.send(req, res);
}
} else if (file.equals("Documentation")) {
res.sendRedirect(uri + "/index.html");
@ -268,12 +256,12 @@ class HttpPluginServlet extends HttpServlet
} else if (exists(entry)) {
sendResource(jar, entry, key, res);
} else {
resourceCache.put(key, NOT_FOUND);
NOT_FOUND.send(req, res);
resourceCache.put(key, Resource.NOT_FOUND);
Resource.NOT_FOUND.send(req, res);
}
} else {
resourceCache.put(key, NOT_FOUND);
NOT_FOUND.send(req, res);
resourceCache.put(key, Resource.NOT_FOUND);
Resource.NOT_FOUND.send(req, res);
}
}
@ -559,7 +547,7 @@ class HttpPluginServlet extends HttpServlet
return 0 <= s ? path.substring(1, s) : path.substring(1);
}
private static void noCache(HttpServletResponse res) {
static void noCache(HttpServletResponse res) {
res.setHeader("Expires", "Fri, 01 Jan 1980 00:00:00 GMT");
res.setHeader("Pragma", "no-cache");
res.setHeader("Cache-Control", "no-cache, must-revalidate");
@ -576,99 +564,6 @@ class HttpPluginServlet extends HttpServlet
}
}
private static final class ResourceKey {
private final Plugin.CacheKey plugin;
private final String resource;
ResourceKey(Plugin p, String r) {
this.plugin = p.getCacheKey();
this.resource = r;
}
int weight() {
return 28 + resource.length();
}
@Override
public int hashCode() {
return plugin.hashCode() * 31 + resource.hashCode();
}
@Override
public boolean equals(Object other) {
if (other instanceof ResourceKey) {
ResourceKey rk = (ResourceKey) other;
return plugin == rk.plugin && resource.equals(rk.resource);
}
return false;
}
}
private static abstract class Resource {
abstract int weight();
abstract void send(HttpServletRequest req, HttpServletResponse res)
throws IOException;
}
private static final class SmallResource extends Resource {
private final byte[] data;
private String contentType;
private String characterEncoding;
private long lastModified;
SmallResource(byte[] data) {
this.data = data;
}
SmallResource setLastModified(long when) {
this.lastModified = when;
return this;
}
SmallResource setContentType(String contentType) {
this.contentType = contentType;
return this;
}
SmallResource setCharacterEncoding(@Nullable String enc) {
this.characterEncoding = enc;
return this;
}
@Override
int weight() {
return data.length;
}
@Override
void send(HttpServletRequest req, HttpServletResponse res)
throws IOException {
if (0 < lastModified) {
res.setDateHeader("Last-Modified", lastModified);
}
res.setContentType(contentType);
if (characterEncoding != null) {
res.setCharacterEncoding(characterEncoding);
}
res.setContentLength(data.length);
res.getOutputStream().write(data);
}
}
private static final Resource NOT_FOUND = new Resource() {
@Override
int weight() {
return 4;
}
@Override
void send(HttpServletRequest req, HttpServletResponse res)
throws IOException {
noCache(res);
res.sendError(HttpServletResponse.SC_NOT_FOUND);
}
};
private static class WrappedRequest extends HttpServletRequestWrapper {
private final String contextPath;


@ -0,0 +1,40 @@
// Copyright (C) 2012 The Android Open Source Project
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package com.google.gerrit.httpd.plugins;
import java.io.IOException;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
abstract class Resource {
static final Resource NOT_FOUND = new Resource() {
@Override
int weigh() {
return 0;
}
@Override
void send(HttpServletRequest req, HttpServletResponse res)
throws IOException {
HttpPluginServlet.noCache(res);
res.sendError(HttpServletResponse.SC_NOT_FOUND);
}
};
abstract int weigh();
abstract void send(HttpServletRequest req, HttpServletResponse res)
throws IOException;
}


@ -0,0 +1,45 @@
// Copyright (C) 2012 The Android Open Source Project
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package com.google.gerrit.httpd.plugins;
import com.google.gerrit.server.plugins.Plugin;
final class ResourceKey {
private final Plugin.CacheKey plugin;
private final String resource;
ResourceKey(Plugin p, String r) {
this.plugin = p.getCacheKey();
this.resource = r;
}
int weigh() {
return resource.length() * 2;
}
@Override
public int hashCode() {
return plugin.hashCode() * 31 + resource.hashCode();
}
@Override
public boolean equals(Object other) {
if (other instanceof ResourceKey) {
ResourceKey rk = (ResourceKey) other;
return plugin == rk.plugin && resource.equals(rk.resource);
}
return false;
}
}
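`ResourceKey.weigh()` above returns `resource.length() * 2` (two bytes per UTF-16 char), and the plugin-resource cache is bounded by total weight (`maximumWeight(2 << 20)`) rather than entry count. A pure-JDK sketch of weight-bounded LRU eviction, as an illustrative stand-in for Guava's `Weigher` mechanism (not the actual Guava internals):

```java
import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.Map;

class WeightBoundedCache<K, V> {
  interface Weigher<K, V> { int weigh(K key, V value); }

  // Access-ordered map: iteration visits least-recently-used entries first.
  private final LinkedHashMap<K, V> map = new LinkedHashMap<>(16, 0.75f, true);
  private final Weigher<K, V> weigher;
  private final long maxWeight;
  private long weight;

  WeightBoundedCache(long maxWeight, Weigher<K, V> weigher) {
    this.maxWeight = maxWeight;
    this.weigher = weigher;
  }

  void put(K key, V value) {
    V old = map.put(key, value);
    if (old != null) {
      weight -= weigher.weigh(key, old);
    }
    weight += weigher.weigh(key, value);
    // Evict least-recently-used entries until back under budget.
    Iterator<Map.Entry<K, V>> it = map.entrySet().iterator();
    while (weight > maxWeight && it.hasNext()) {
      Map.Entry<K, V> e = it.next();
      weight -= weigher.weigh(e.getKey(), e.getValue());
      it.remove();
    }
  }

  V get(K key) { return map.get(key); }
  long weight() { return weight; }
}
```

Weighing by bytes instead of counting entries keeps one huge resource from occupying the same budget as thousands of small ones, which is why the commit swaps `memoryLimit` counts for `maximumWeight`.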


@ -1,4 +1,4 @@
// Copyright (C) 2009 The Android Open Source Project
// Copyright (C) 2012 The Android Open Source Project
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
@ -12,11 +12,13 @@
// See the License for the specific language governing permissions and
// limitations under the License.
package com.google.gerrit.server.cache;
package com.google.gerrit.httpd.plugins;
import com.google.common.cache.Weigher;
/** Configure a cache declared within a {@link CacheModule} instance. */
public interface UnnamedCacheBinding<K, V> {
/** Set the name of the cache. */
public NamedCacheBinding<K, V> name(String cacheName);
class ResourceWeigher implements Weigher<ResourceKey, Resource> {
@Override
public int weigh(ResourceKey key, Resource value) {
return key.weigh() + value.weigh();
}
}
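The ResourceWeigher above feeds a weight-bounded cache: each entry is charged its weigh() cost and least-recently-used entries are dropped once the total exceeds a cap. As a rough sketch of that behavior — using an access-ordered LinkedHashMap as a JDK-only stand-in for Guava's CacheBuilder maximumWeight()/weigher(), which is what the commit actually wires up:

```java
import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.Map;

// Minimal sketch of a weight-bounded LRU cache. Guava's CacheBuilder
// provides this for real (plus concurrency and statistics); the
// stand-in below only shows how a weigher and LRU eviction interact.
class WeightedLruSketch<K, V> {
  interface Weigher<K, V> {
    int weigh(K key, V value);
  }

  private final LinkedHashMap<K, V> map =
      new LinkedHashMap<>(16, 0.75f, true /* access order */);
  private final Weigher<K, V> weigher;
  private final long maxWeight;
  private long totalWeight;

  WeightedLruSketch(long maxWeight, Weigher<K, V> weigher) {
    this.maxWeight = maxWeight;
    this.weigher = weigher;
  }

  synchronized void put(K key, V value) {
    V old = map.put(key, value);
    if (old != null) {
      totalWeight -= weigher.weigh(key, old);
    }
    totalWeight += weigher.weigh(key, value);
    // Evict eldest (least recently used) entries until under the cap.
    Iterator<Map.Entry<K, V>> it = map.entrySet().iterator();
    while (totalWeight > maxWeight && it.hasNext()) {
      Map.Entry<K, V> e = it.next();
      totalWeight -= weigher.weigh(e.getKey(), e.getValue());
      it.remove();
    }
  }

  synchronized V get(K key) {
    return map.get(key);
  }
}
```

A plugin resource cache built this way stays under a fixed memory budget regardless of how many small files plugins serve.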


@ -0,0 +1,66 @@
// Copyright (C) 2012 The Android Open Source Project
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package com.google.gerrit.httpd.plugins;
import java.io.IOException;
import javax.annotation.Nullable;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
final class SmallResource extends Resource {
private final byte[] data;
private String contentType;
private String characterEncoding;
private long lastModified;
SmallResource(byte[] data) {
this.data = data;
}
SmallResource setLastModified(long when) {
this.lastModified = when;
return this;
}
SmallResource setContentType(String contentType) {
this.contentType = contentType;
return this;
}
SmallResource setCharacterEncoding(@Nullable String enc) {
this.characterEncoding = enc;
return this;
}
@Override
int weigh() {
return contentType.length() * 2 + data.length;
}
@Override
void send(HttpServletRequest req, HttpServletResponse res)
throws IOException {
if (0 < lastModified) {
res.setDateHeader("Last-Modified", lastModified);
}
res.setContentType(contentType);
if (characterEncoding != null) {
res.setCharacterEncoding(characterEncoding);
}
res.setContentLength(data.length);
res.getOutputStream().write(data);
}
}


@ -31,6 +31,7 @@ import com.google.gerrit.server.IdentifiedUser;
import com.google.gerrit.server.patch.PatchList;
import com.google.gerrit.server.patch.PatchListCache;
import com.google.gerrit.server.patch.PatchListKey;
import com.google.gerrit.server.patch.PatchListNotAvailableException;
import com.google.gerrit.server.patch.PatchSetInfoFactory;
import com.google.gerrit.server.patch.PatchSetInfoNotAvailableException;
import com.google.gerrit.server.project.ChangeControl;
@ -108,18 +109,19 @@ class PatchSetDetailFactory extends Handler<PatchSetDetail> {
final PatchList list;
if (psIdBase != null) {
oldId = toObjectId(psIdBase);
newId = toObjectId(psIdNew);
try {
if (psIdBase != null) {
oldId = toObjectId(psIdBase);
newId = toObjectId(psIdNew);
projectKey = control.getProject().getNameKey();
projectKey = control.getProject().getNameKey();
list = listFor(keyFor(diffPrefs.getIgnoreWhitespace()));
} else { // OK, means use base to compare
list = patchListCache.get(control.getChange(), patchSet);
if (list == null) {
throw new NoSuchEntityException();
list = listFor(keyFor(diffPrefs.getIgnoreWhitespace()));
} else { // OK, means use base to compare
list = patchListCache.get(control.getChange(), patchSet);
}
} catch (PatchListNotAvailableException e) {
throw new NoSuchEntityException();
}
final List<Patch> patches = list.toPatchList(patchSet.getId());
@ -185,7 +187,8 @@ class PatchSetDetailFactory extends Handler<PatchSetDetail> {
return new PatchListKey(projectKey, oldId, newId, whitespace);
}
private PatchList listFor(final PatchListKey key) {
private PatchList listFor(PatchListKey key)
throws PatchListNotAvailableException {
return patchListCache.get(key);
}
}


@ -35,6 +35,7 @@ import com.google.gerrit.server.patch.PatchList;
import com.google.gerrit.server.patch.PatchListCache;
import com.google.gerrit.server.patch.PatchListEntry;
import com.google.gerrit.server.patch.PatchListKey;
import com.google.gerrit.server.patch.PatchListNotAvailableException;
import com.google.gerrit.server.project.ChangeControl;
import com.google.gerrit.server.project.NoSuchChangeException;
import com.google.gwtorm.server.OrmException;
@ -154,12 +155,12 @@ class PatchScriptFactory extends Handler<PatchScript> {
content.getOldName(), //
content.getNewName());
try {
return b.toPatchScript(content, comments, history);
} catch (IOException e) {
log.error("File content unavailable", e);
throw new NoSuchChangeException(changeId, e);
}
} catch (PatchListNotAvailableException e) {
throw new NoSuchChangeException(changeId, e);
} catch (IOException e) {
log.error("File content unavailable", e);
throw new NoSuchChangeException(changeId, e);
} finally {
git.close();
}
@ -169,7 +170,8 @@ class PatchScriptFactory extends Handler<PatchScript> {
return new PatchListKey(projectKey, aId, bId, whitespace);
}
private PatchList listFor(final PatchListKey key) {
private PatchList listFor(final PatchListKey key)
throws PatchListNotAvailableException {
return patchListCache.get(key);
}


@ -17,7 +17,6 @@ package com.google.gerrit.pgm;
import static com.google.gerrit.server.schema.DataSourceProvider.Context.MULTI_USER;
import com.google.gerrit.common.ChangeHookRunner;
import com.google.gerrit.ehcache.EhcachePoolImpl;
import com.google.gerrit.httpd.CacheBasedWebSession;
import com.google.gerrit.httpd.GitOverHttpModule;
import com.google.gerrit.httpd.HttpCanonicalWebUrlProvider;
@ -36,6 +35,7 @@ import com.google.gerrit.pgm.util.LogFileCompressor;
import com.google.gerrit.pgm.util.RuntimeShutdown;
import com.google.gerrit.pgm.util.SiteProgram;
import com.google.gerrit.reviewdb.client.AuthType;
import com.google.gerrit.server.cache.h2.DefaultCacheFactory;
import com.google.gerrit.server.config.AuthConfig;
import com.google.gerrit.server.config.AuthConfigModule;
import com.google.gerrit.server.config.CanonicalWebUrlModule;
@ -209,7 +209,7 @@ public class Daemon extends SiteProgram {
modules.add(new ChangeHookRunner.Module());
modules.add(new ReceiveCommitsExecutorModule());
modules.add(cfgInjector.getInstance(GerritGlobalModule.class));
modules.add(new EhcachePoolImpl.Module());
modules.add(new DefaultCacheFactory.Module());
modules.add(new SmtpEmailSender.Module());
modules.add(new SignedTokenEmailTokenVerifier.Module());
modules.add(new PluginModule());


@ -17,7 +17,6 @@ package com.google.gerrit.pgm;
import static com.google.gerrit.server.schema.DataSourceProvider.Context.MULTI_USER;
import com.google.gerrit.common.data.ApprovalTypes;
import com.google.gerrit.ehcache.EhcachePoolImpl;
import com.google.gerrit.lifecycle.LifecycleManager;
import com.google.gerrit.lifecycle.LifecycleModule;
import com.google.gerrit.pgm.util.SiteProgram;
@ -27,7 +26,7 @@ import com.google.gerrit.reviewdb.client.Project;
import com.google.gerrit.reviewdb.server.ReviewDb;
import com.google.gerrit.server.account.AccountCacheImpl;
import com.google.gerrit.server.account.GroupCacheImpl;
import com.google.gerrit.server.cache.CachePool;
import com.google.gerrit.server.cache.h2.DefaultCacheFactory;
import com.google.gerrit.server.config.ApprovalTypesProvider;
import com.google.gerrit.server.config.CanonicalWebUrl;
import com.google.gerrit.server.config.CanonicalWebUrlProvider;
@ -100,7 +99,7 @@ public class ExportReviewNotes extends SiteProgram {
install(AccountCacheImpl.module());
install(GroupCacheImpl.module());
install(new EhcachePoolImpl.Module());
install(new DefaultCacheFactory.Module());
install(new FactoryModule() {
@Override
protected void configure() {


@ -62,7 +62,6 @@ limitations under the License.
<excludes>
<exclude>gwtexpui:gwtexpui</exclude>
<exclude>gwtjsonrpc:gwtjsonrpc</exclude>
<exclude>com.google.gerrit:gerrit-ehcache</exclude>
<exclude>com.google.gerrit:gerrit-prettify</exclude>
<exclude>com.google.gerrit:gerrit-patch-commonsnet</exclude>
<exclude>com.google.gerrit:gerrit-patch-jgit</exclude>
@ -82,7 +81,6 @@ limitations under the License.
<exclude>asm:asm</exclude>
<exclude>eu.medsea.mimeutil:mime-util</exclude>
<exclude>net.sf.ehcache:ehcache-core</exclude>
<exclude>org.antlr:antlr</exclude>
<exclude>org.antlr:antlr-runtime</exclude>
<exclude>org.apache.mina:mina-core</exclude>


@ -109,6 +109,11 @@ limitations under the License.
<artifactId>aopalliance</artifactId>
</dependency>
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
</dependency>
<dependency>
<groupId>com.google.gerrit</groupId>
<artifactId>gerrit-antlr</artifactId>


@ -30,6 +30,7 @@ import com.google.gerrit.server.git.GitRepositoryManager;
import com.google.gerrit.server.patch.PatchList;
import com.google.gerrit.server.patch.PatchListCache;
import com.google.gerrit.server.patch.PatchListKey;
import com.google.gerrit.server.patch.PatchListNotAvailableException;
import com.google.gerrit.server.patch.PatchSetInfoFactory;
import com.google.gerrit.server.patch.PatchSetInfoNotAvailableException;
import com.google.gerrit.server.project.ChangeControl;
@ -80,8 +81,10 @@ public final class StoredValues {
ObjectId b = ObjectId.fromString(psInfo.getRevId());
Whitespace ws = Whitespace.IGNORE_NONE;
PatchListKey plKey = new PatchListKey(projectKey, a, b, ws);
PatchList patchList = plCache.get(plKey);
if (patchList == null) {
PatchList patchList;
try {
patchList = plCache.get(plKey);
} catch (PatchListNotAvailableException e) {
throw new SystemException("Cannot create " + plKey);
}
return patchList;


@ -14,12 +14,14 @@
package com.google.gerrit.server.account;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.LoadingCache;
import com.google.common.collect.ImmutableSet;
import com.google.common.collect.Sets;
import com.google.gerrit.reviewdb.client.Account;
import com.google.gerrit.reviewdb.client.AccountExternalId;
import com.google.gerrit.reviewdb.server.ReviewDb;
import com.google.gerrit.server.cache.Cache;
import com.google.gerrit.server.cache.CacheModule;
import com.google.gerrit.server.cache.EntryCreator;
import com.google.gwtorm.server.SchemaFactory;
import com.google.inject.Inject;
import com.google.inject.Module;
@ -27,45 +29,58 @@ import com.google.inject.Singleton;
import com.google.inject.TypeLiteral;
import com.google.inject.name.Named;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.util.Collections;
import java.util.HashSet;
import java.util.Set;
import java.util.concurrent.ExecutionException;
/** Translates an email address to a set of matching accounts. */
@Singleton
public class AccountByEmailCacheImpl implements AccountByEmailCache {
private static final Logger log = LoggerFactory
.getLogger(AccountByEmailCacheImpl.class);
private static final String CACHE_NAME = "accounts_byemail";
public static Module module() {
return new CacheModule() {
@Override
protected void configure() {
final TypeLiteral<Cache<String, Set<Account.Id>>> type =
new TypeLiteral<Cache<String, Set<Account.Id>>>() {};
core(type, CACHE_NAME).populateWith(Loader.class);
cache(CACHE_NAME,
String.class,
new TypeLiteral<Set<Account.Id>>() {})
.loader(Loader.class);
bind(AccountByEmailCacheImpl.class);
bind(AccountByEmailCache.class).to(AccountByEmailCacheImpl.class);
}
};
}
private final Cache<String, Set<Account.Id>> cache;
private final LoadingCache<String, Set<Account.Id>> cache;
@Inject
AccountByEmailCacheImpl(
@Named(CACHE_NAME) final Cache<String, Set<Account.Id>> cache) {
@Named(CACHE_NAME) LoadingCache<String, Set<Account.Id>> cache) {
this.cache = cache;
}
public Set<Account.Id> get(final String email) {
return cache.get(email);
try {
return cache.get(email);
} catch (ExecutionException e) {
log.warn("Cannot resolve accounts by email", e);
return Collections.emptySet();
}
}
public void evict(final String email) {
cache.remove(email);
if (email != null) {
cache.invalidate(email);
}
}
static class Loader extends EntryCreator<String, Set<Account.Id>> {
static class Loader extends CacheLoader<String, Set<Account.Id>> {
private final SchemaFactory<ReviewDb> schema;
@Inject
@ -74,10 +89,10 @@ public class AccountByEmailCacheImpl implements AccountByEmailCache {
}
@Override
public Set<Account.Id> createEntry(final String email) throws Exception {
public Set<Account.Id> load(String email) throws Exception {
final ReviewDb db = schema.open();
try {
final HashSet<Account.Id> r = new HashSet<Account.Id>();
Set<Account.Id> r = Sets.newHashSet();
for (Account a : db.accounts().byPreferredEmail(email)) {
r.add(a.getId());
}
@ -85,30 +100,10 @@ public class AccountByEmailCacheImpl implements AccountByEmailCache {
.byEmailAddress(email)) {
r.add(a.getAccountId());
}
return pack(r);
return ImmutableSet.copyOf(r);
} finally {
db.close();
}
}
@Override
public Set<Account.Id> missing(final String key) {
return Collections.emptySet();
}
private static Set<Account.Id> pack(final Set<Account.Id> c) {
switch (c.size()) {
case 0:
return Collections.emptySet();
case 1:
return one(c);
default:
return Collections.unmodifiableSet(new HashSet<Account.Id>(c));
}
}
private static <T> Set<T> one(final Set<T> c) {
return Collections.singleton(c.iterator().next());
}
}
}
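The conversion above establishes the pattern the rest of the commit follows: the CacheLoader may throw, and the getter catches the ExecutionException, logs, and degrades to a safe empty result rather than propagating the failure. A minimal stand-in sketch — ConcurrentHashMap.computeIfAbsent replaces LoadingCache.get, and accountsFor(...) is a hypothetical loader, not the real ReviewDb query:

```java
import java.util.Collections;
import java.util.Map;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;

// Sketch of degrade-on-failure lookup: when the loader throws, the
// getter returns an empty result instead of failing the caller
// (logging omitted for brevity).
class ByEmailSketch {
  private final Map<String, Set<Integer>> cache = new ConcurrentHashMap<>();

  Set<Integer> get(String email) {
    try {
      return cache.computeIfAbsent(email, this::accountsFor);
    } catch (RuntimeException e) {
      // Mirrors catching ExecutionException from LoadingCache.get().
      return Collections.emptySet();
    }
  }

  // Hypothetical loader; a real one queries the database. An empty
  // result is cached too, so repeated misses stay cheap.
  private Set<Integer> accountsFor(String email) {
    return Collections.emptySet();
  }
}
```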


@ -14,14 +14,16 @@
package com.google.gerrit.server.account;
import com.google.common.base.Optional;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.LoadingCache;
import com.google.common.collect.ImmutableSet;
import com.google.gerrit.reviewdb.client.Account;
import com.google.gerrit.reviewdb.client.AccountExternalId;
import com.google.gerrit.reviewdb.client.AccountGroup;
import com.google.gerrit.reviewdb.client.AccountGroupMember;
import com.google.gerrit.reviewdb.server.ReviewDb;
import com.google.gerrit.server.cache.Cache;
import com.google.gerrit.server.cache.CacheModule;
import com.google.gerrit.server.cache.EntryCreator;
import com.google.gwtorm.server.OrmException;
import com.google.gwtorm.server.SchemaFactory;
import com.google.inject.Inject;
@ -30,14 +32,21 @@ import com.google.inject.Singleton;
import com.google.inject.TypeLiteral;
import com.google.inject.name.Named;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.util.Collection;
import java.util.Collections;
import java.util.HashSet;
import java.util.Set;
import java.util.concurrent.ExecutionException;
/** Caches important (but small) account state to avoid database hits. */
@Singleton
public class AccountCacheImpl implements AccountCache {
private static final Logger log = LoggerFactory
.getLogger(AccountCacheImpl.class);
private static final String BYID_NAME = "accounts";
private static final String BYUSER_NAME = "accounts_byname";
@ -45,13 +54,13 @@ public class AccountCacheImpl implements AccountCache {
return new CacheModule() {
@Override
protected void configure() {
final TypeLiteral<Cache<Account.Id, AccountState>> byIdType =
new TypeLiteral<Cache<Account.Id, AccountState>>() {};
core(byIdType, BYID_NAME).populateWith(ByIdLoader.class);
cache(BYID_NAME, Account.Id.class, AccountState.class)
.loader(ByIdLoader.class);
final TypeLiteral<Cache<String, Account.Id>> byUsernameType =
new TypeLiteral<Cache<String, Account.Id>>() {};
core(byUsernameType, BYUSER_NAME).populateWith(ByNameLoader.class);
cache(BYUSER_NAME,
String.class,
new TypeLiteral<Optional<Account.Id>>() {})
.loader(ByNameLoader.class);
bind(AccountCacheImpl.class);
bind(AccountCache.class).to(AccountCacheImpl.class);
@ -59,54 +68,76 @@ public class AccountCacheImpl implements AccountCache {
};
}
private final Cache<Account.Id, AccountState> byId;
private final Cache<String, Account.Id> byName;
private final LoadingCache<Account.Id, AccountState> byId;
private final LoadingCache<String, Optional<Account.Id>> byName;
@Inject
AccountCacheImpl(@Named(BYID_NAME) Cache<Account.Id, AccountState> byId,
@Named(BYUSER_NAME) Cache<String, Account.Id> byUsername) {
AccountCacheImpl(@Named(BYID_NAME) LoadingCache<Account.Id, AccountState> byId,
@Named(BYUSER_NAME) LoadingCache<String, Optional<Account.Id>> byUsername) {
this.byId = byId;
this.byName = byUsername;
}
public AccountState get(final Account.Id accountId) {
return byId.get(accountId);
public AccountState get(Account.Id accountId) {
try {
return byId.get(accountId);
} catch (ExecutionException e) {
log.warn("Cannot load AccountState for " + accountId, e);
return missing(accountId);
}
}
@Override
public AccountState getByUsername(String username) {
Account.Id id = byName.get(username);
return id != null ? byId.get(id) : null;
try {
Optional<Account.Id> id = byName.get(username);
return id != null && id.isPresent() ? byId.get(id.get()) : null;
} catch (ExecutionException e) {
log.warn("Cannot load AccountState for " + username, e);
return null;
}
}
public void evict(final Account.Id accountId) {
byId.remove(accountId);
public void evict(Account.Id accountId) {
if (accountId != null) {
byId.invalidate(accountId);
}
}
public void evictByUsername(String username) {
byName.remove(username);
if (username != null) {
byName.invalidate(username);
}
}
static class ByIdLoader extends EntryCreator<Account.Id, AccountState> {
private static AccountState missing(Account.Id accountId) {
Account account = new Account(accountId);
Collection<AccountExternalId> ids = Collections.emptySet();
Set<AccountGroup.UUID> anon = ImmutableSet.of(AccountGroup.ANONYMOUS_USERS);
return new AccountState(account, anon, ids);
}
static class ByIdLoader extends CacheLoader<Account.Id, AccountState> {
private final SchemaFactory<ReviewDb> schema;
private final GroupCache groupCache;
private final Cache<String, Account.Id> byName;
private final LoadingCache<String, Optional<Account.Id>> byName;
@Inject
ByIdLoader(SchemaFactory<ReviewDb> sf, GroupCache groupCache,
@Named(BYUSER_NAME) Cache<String, Account.Id> byUsername) {
@Named(BYUSER_NAME) LoadingCache<String, Optional<Account.Id>> byUsername) {
this.schema = sf;
this.groupCache = groupCache;
this.byName = byUsername;
}
@Override
public AccountState createEntry(final Account.Id key) throws Exception {
public AccountState load(Account.Id key) throws Exception {
final ReviewDb db = schema.open();
try {
final AccountState state = load(db, key);
if (state.getUserName() != null) {
byName.put(state.getUserName(), state.getAccount().getId());
String user = state.getUserName();
if (user != null) {
byName.put(user, Optional.of(state.getAccount().getId()));
}
return state;
} finally {
@ -142,18 +173,9 @@ public class AccountCacheImpl implements AccountCache {
return new AccountState(account, internalGroups, externalIds);
}
@Override
public AccountState missing(final Account.Id accountId) {
final Account account = new Account(accountId);
final Collection<AccountExternalId> ids = Collections.emptySet();
final Set<AccountGroup.UUID> anonymous =
Collections.singleton(AccountGroup.ANONYMOUS_USERS);
return new AccountState(account, anonymous, ids);
}
}
static class ByNameLoader extends EntryCreator<String, Account.Id> {
static class ByNameLoader extends CacheLoader<String, Optional<Account.Id>> {
private final SchemaFactory<ReviewDb> schema;
@Inject
@ -162,14 +184,17 @@ public class AccountCacheImpl implements AccountCache {
}
@Override
public Account.Id createEntry(final String username) throws Exception {
public Optional<Account.Id> load(String username) throws Exception {
final ReviewDb db = schema.open();
try {
final AccountExternalId.Key key = new AccountExternalId.Key( //
AccountExternalId.SCHEME_USERNAME, //
username);
final AccountExternalId id = db.accountExternalIds().get(key);
return id != null ? id.getAccountId() : null;
if (id != null) {
return Optional.of(id.getAccountId());
}
return Optional.absent();
} finally {
db.close();
}
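ByNameLoader now returns Optional<Account.Id> so that "no such user" becomes an ordinary cacheable value instead of a null the cache cannot store. A minimal sketch of that negative-caching idea, under stated stand-ins: ConcurrentHashMap in place of Guava's LoadingCache, java.util.Optional in place of Guava's, and a hard-coded table in place of the ReviewDb lookup:

```java
import java.util.Map;
import java.util.Optional;
import java.util.concurrent.ConcurrentHashMap;

// Sketch of negative caching with Optional: the loader never returns
// null, so misses are stored and re-served without another database
// round trip.
class ByNameSketch {
  private final Map<String, Optional<Integer>> byName = new ConcurrentHashMap<>();

  Optional<Integer> get(String username) {
    return byName.computeIfAbsent(username, this::load);
  }

  // Hypothetical loader; the real one resolves the username through
  // the external-ids table.
  private Optional<Integer> load(String username) {
    if ("alice".equals(username)) {
      return Optional.of(1000001);
    }
    return Optional.empty(); // cached negative result
  }

  void evict(String username) {
    if (username != null) {
      byName.remove(username); // invalidate() equivalent
    }
  }
}
```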


@ -14,12 +14,16 @@
package com.google.gerrit.server.account;
import com.google.common.base.Optional;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.LoadingCache;
import com.google.common.collect.ImmutableList;
import com.google.gerrit.reviewdb.client.AccountGroup;
import com.google.gerrit.reviewdb.client.AccountGroupName;
import com.google.gerrit.reviewdb.server.ReviewDb;
import com.google.gerrit.server.cache.Cache;
import com.google.gerrit.server.cache.CacheModule;
import com.google.gerrit.server.cache.EntryCreator;
import com.google.gwtorm.server.OrmDuplicateKeyException;
import com.google.gwtorm.server.OrmException;
import com.google.gwtorm.server.SchemaFactory;
import com.google.inject.Inject;
import com.google.inject.Module;
@ -27,48 +31,48 @@ import com.google.inject.Singleton;
import com.google.inject.TypeLiteral;
import com.google.inject.name.Named;
import java.util.ArrayList;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.util.Collection;
import java.util.Collections;
import java.util.List;
import java.util.SortedSet;
import java.util.TreeSet;
import java.util.concurrent.locks.Lock;
import java.util.concurrent.locks.ReentrantLock;
import java.util.concurrent.ExecutionException;
/** Tracks group objects in memory for efficient access. */
@Singleton
public class GroupCacheImpl implements GroupCache {
private static final Logger log = LoggerFactory
.getLogger(GroupCacheImpl.class);
private static final String BYID_NAME = "groups";
private static final String BYNAME_NAME = "groups_byname";
private static final String BYUUID_NAME = "groups_byuuid";
private static final String BYEXT_NAME = "groups_byext";
private static final String BYNAME_LIST = "groups_byname_list";
public static Module module() {
return new CacheModule() {
@Override
protected void configure() {
final TypeLiteral<Cache<AccountGroup.Id, AccountGroup>> byId =
new TypeLiteral<Cache<AccountGroup.Id, AccountGroup>>() {};
core(byId, BYID_NAME).populateWith(ByIdLoader.class);
cache(BYID_NAME,
AccountGroup.Id.class,
new TypeLiteral<Optional<AccountGroup>>() {})
.loader(ByIdLoader.class);
final TypeLiteral<Cache<AccountGroup.NameKey, AccountGroup>> byName =
new TypeLiteral<Cache<AccountGroup.NameKey, AccountGroup>>() {};
core(byName, BYNAME_NAME).populateWith(ByNameLoader.class);
cache(BYNAME_NAME,
String.class,
new TypeLiteral<Optional<AccountGroup>>() {})
.loader(ByNameLoader.class);
final TypeLiteral<Cache<AccountGroup.UUID, AccountGroup>> byUUID =
new TypeLiteral<Cache<AccountGroup.UUID, AccountGroup>>() {};
core(byUUID, BYUUID_NAME).populateWith(ByUUIDLoader.class);
cache(BYUUID_NAME,
String.class,
new TypeLiteral<Optional<AccountGroup>>() {})
.loader(ByUUIDLoader.class);
final TypeLiteral<Cache<AccountGroup.ExternalNameKey, Collection<AccountGroup>>> byExternalName =
new TypeLiteral<Cache<AccountGroup.ExternalNameKey, Collection<AccountGroup>>>() {};
core(byExternalName, BYEXT_NAME) //
.populateWith(ByExternalNameLoader.class);
final TypeLiteral<Cache<ListKey, SortedSet<AccountGroup.NameKey>>> listType =
new TypeLiteral<Cache<ListKey, SortedSet<AccountGroup.NameKey>>>() {};
core(listType, BYNAME_LIST).populateWith(Lister.class);
cache(BYEXT_NAME,
String.class,
new TypeLiteral<Collection<AccountGroup>>() {})
.loader(ByExternalNameLoader.class);
bind(GroupCacheImpl.class);
bind(GroupCache.class).to(GroupCacheImpl.class);
@ -76,94 +80,126 @@ public class GroupCacheImpl implements GroupCache {
};
}
private final Cache<AccountGroup.Id, AccountGroup> byId;
private final Cache<AccountGroup.NameKey, AccountGroup> byName;
private final Cache<AccountGroup.UUID, AccountGroup> byUUID;
private final Cache<AccountGroup.ExternalNameKey, Collection<AccountGroup>> byExternalName;
private final Cache<ListKey,SortedSet<AccountGroup.NameKey>> list;
private final Lock listLock;
private final LoadingCache<AccountGroup.Id, Optional<AccountGroup>> byId;
private final LoadingCache<String, Optional<AccountGroup>> byName;
private final LoadingCache<String, Optional<AccountGroup>> byUUID;
private final LoadingCache<String, Collection<AccountGroup>> byExternalName;
private final SchemaFactory<ReviewDb> schema;
@Inject
GroupCacheImpl(
@Named(BYID_NAME) Cache<AccountGroup.Id, AccountGroup> byId,
@Named(BYNAME_NAME) Cache<AccountGroup.NameKey, AccountGroup> byName,
@Named(BYUUID_NAME) Cache<AccountGroup.UUID, AccountGroup> byUUID,
@Named(BYEXT_NAME) Cache<AccountGroup.ExternalNameKey, Collection<AccountGroup>> byExternalName,
@Named(BYNAME_LIST) final Cache<ListKey, SortedSet<AccountGroup.NameKey>> list) {
@Named(BYID_NAME) LoadingCache<AccountGroup.Id, Optional<AccountGroup>> byId,
@Named(BYNAME_NAME) LoadingCache<String, Optional<AccountGroup>> byName,
@Named(BYUUID_NAME) LoadingCache<String, Optional<AccountGroup>> byUUID,
@Named(BYEXT_NAME) LoadingCache<String, Collection<AccountGroup>> byExternalName,
SchemaFactory<ReviewDb> schema) {
this.byId = byId;
this.byName = byName;
this.byUUID = byUUID;
this.byExternalName = byExternalName;
this.list = list;
this.listLock = new ReentrantLock(true /* fair */);
this.schema = schema;
}
public AccountGroup get(final AccountGroup.Id groupId) {
return byId.get(groupId);
try {
Optional<AccountGroup> g = byId.get(groupId);
return g.isPresent() ? g.get() : missing(groupId);
} catch (ExecutionException e) {
log.warn("Cannot load group " + groupId, e);
return missing(groupId);
}
}
public void evict(final AccountGroup group) {
byId.remove(group.getId());
byName.remove(group.getNameKey());
byUUID.remove(group.getGroupUUID());
byExternalName.remove(group.getExternalNameKey());
if (group.getId() != null) {
byId.invalidate(group.getId());
}
if (group.getNameKey() != null) {
byName.invalidate(group.getNameKey().get());
}
if (group.getGroupUUID() != null) {
byUUID.invalidate(group.getGroupUUID().get());
}
if (group.getExternalNameKey() != null) {
byExternalName.invalidate(group.getExternalNameKey().get());
}
}
public void evictAfterRename(final AccountGroup.NameKey oldName,
final AccountGroup.NameKey newName) {
byName.remove(oldName);
updateGroupList(oldName, newName);
if (oldName != null) {
byName.invalidate(oldName.get());
}
if (newName != null) {
byName.invalidate(newName.get());
}
}
public AccountGroup get(final AccountGroup.NameKey name) {
return byName.get(name);
public AccountGroup get(AccountGroup.NameKey name) {
if (name == null) {
return null;
}
try {
return byName.get(name.get()).orNull();
} catch (ExecutionException e) {
log.warn(String.format("Cannot lookup group %s by name", name.get()), e);
return null;
}
}
public AccountGroup get(final AccountGroup.UUID uuid) {
return byUUID.get(uuid);
public AccountGroup get(AccountGroup.UUID uuid) {
if (uuid == null) {
return null;
}
try {
return byUUID.get(uuid.get()).orNull();
} catch (ExecutionException e) {
log.warn(String.format("Cannot lookup group %s by uuid", uuid.get()), e);
return null;
}
}
public Collection<AccountGroup> get(
final AccountGroup.ExternalNameKey externalName) {
return byExternalName.get(externalName);
public Collection<AccountGroup> get(AccountGroup.ExternalNameKey name) {
if (name == null) {
return Collections.emptyList();
}
try {
return byExternalName.get(name.get());
} catch (ExecutionException e) {
log.warn("Cannot lookup external group " + name, e);
return Collections.emptyList();
}
}
@Override
public Iterable<AccountGroup> all() {
final List<AccountGroup> groups = new ArrayList<AccountGroup>();
for (final AccountGroup.NameKey groupName : list.get(ListKey.ALL)) {
final AccountGroup group = get(groupName);
if (group != null) {
groups.add(group);
try {
ReviewDb db = schema.open();
try {
return Collections.unmodifiableList(db.accountGroups().all().toList());
} finally {
db.close();
}
} catch (OrmException e) {
log.warn("Cannot list internal groups", e);
return Collections.emptyList();
}
return Collections.unmodifiableList(groups);
}
@Override
public void onCreateGroup(final AccountGroup.NameKey newGroupName) {
updateGroupList(null, newGroupName);
public void onCreateGroup(AccountGroup.NameKey newGroupName) {
byName.invalidate(newGroupName.get());
}
private void updateGroupList(final AccountGroup.NameKey nameToRemove,
final AccountGroup.NameKey nameToAdd) {
listLock.lock();
try {
SortedSet<AccountGroup.NameKey> n = list.get(ListKey.ALL);
n = new TreeSet<AccountGroup.NameKey>(n);
if (nameToRemove != null) {
n.remove(nameToRemove);
}
if (nameToAdd != null) {
n.add(nameToAdd);
}
list.put(ListKey.ALL, Collections.unmodifiableSortedSet(n));
} finally {
listLock.unlock();
}
private static AccountGroup missing(AccountGroup.Id key) {
AccountGroup.NameKey name = new AccountGroup.NameKey("Deleted Group" + key);
AccountGroup g = new AccountGroup(name, key, null);
g.setType(AccountGroup.Type.SYSTEM);
return g;
}
static class ByIdLoader extends EntryCreator<AccountGroup.Id, AccountGroup> {
static class ByIdLoader extends
CacheLoader<AccountGroup.Id, Optional<AccountGroup>> {
private final SchemaFactory<ReviewDb> schema;
@Inject
@ -172,32 +208,18 @@ public class GroupCacheImpl implements GroupCache {
}
@Override
public AccountGroup createEntry(final AccountGroup.Id key) throws Exception {
public Optional<AccountGroup> load(final AccountGroup.Id key)
throws Exception {
final ReviewDb db = schema.open();
try {
final AccountGroup group = db.accountGroups().get(key);
if (group != null) {
return group;
} else {
return missing(key);
}
return Optional.fromNullable(db.accountGroups().get(key));
} finally {
db.close();
}
}
@Override
public AccountGroup missing(final AccountGroup.Id key) {
final AccountGroup.NameKey name =
new AccountGroup.NameKey("Deleted Group" + key.toString());
final AccountGroup g = new AccountGroup(name, key, null);
g.setType(AccountGroup.Type.SYSTEM);
return g;
}
}
static class ByNameLoader extends
EntryCreator<AccountGroup.NameKey, AccountGroup> {
static class ByNameLoader extends CacheLoader<String, Optional<AccountGroup>> {
private final SchemaFactory<ReviewDb> schema;
@Inject
@ -206,25 +228,23 @@ public class GroupCacheImpl implements GroupCache {
}
@Override
public AccountGroup createEntry(final AccountGroup.NameKey key)
public Optional<AccountGroup> load(String name)
throws Exception {
final AccountGroupName r;
final ReviewDb db = schema.open();
try {
r = db.accountGroupNames().get(key);
AccountGroup.NameKey key = new AccountGroup.NameKey(name);
AccountGroupName r = db.accountGroupNames().get(key);
if (r != null) {
return db.accountGroups().get(r.getId());
} else {
return null;
return Optional.fromNullable(db.accountGroups().get(r.getId()));
}
return Optional.absent();
} finally {
db.close();
}
}
}
static class ByUUIDLoader extends
EntryCreator<AccountGroup.UUID, AccountGroup> {
static class ByUUIDLoader extends CacheLoader<String, Optional<AccountGroup>> {
private final SchemaFactory<ReviewDb> schema;
@Inject
@ -233,15 +253,19 @@ public class GroupCacheImpl implements GroupCache {
}
@Override
public AccountGroup createEntry(final AccountGroup.UUID uuid)
public Optional<AccountGroup> load(String uuid)
throws Exception {
final ReviewDb db = schema.open();
try {
List<AccountGroup> r = db.accountGroups().byUUID(uuid).toList();
List<AccountGroup> r;
r = db.accountGroups().byUUID(new AccountGroup.UUID(uuid)).toList();
if (r.size() == 1) {
return r.get(0);
return Optional.of(r.get(0));
} else if (r.size() == 0) {
return Optional.absent();
} else {
return null;
throw new OrmDuplicateKeyException("Duplicate group UUID " + uuid);
}
} finally {
db.close();
@ -250,7 +274,7 @@ public class GroupCacheImpl implements GroupCache {
}
static class ByExternalNameLoader extends
EntryCreator<AccountGroup.ExternalNameKey, Collection<AccountGroup>> {
CacheLoader<String, Collection<AccountGroup>> {
private final SchemaFactory<ReviewDb> schema;
@Inject
@ -259,45 +283,13 @@ public class GroupCacheImpl implements GroupCache {
}
@Override
public Collection<AccountGroup> createEntry(
final AccountGroup.ExternalNameKey key) throws Exception {
final ReviewDb db = schema.open();
try {
return db.accountGroups().byExternalName(key).toList();
} finally {
db.close();
}
}
}
static class ListKey {
static final ListKey ALL = new ListKey();
private ListKey() {
}
}
static class Lister extends EntryCreator<ListKey, SortedSet<AccountGroup.NameKey>> {
private final SchemaFactory<ReviewDb> schema;
@Inject
Lister(final SchemaFactory<ReviewDb> sf) {
schema = sf;
}
@Override
public SortedSet<AccountGroup.NameKey> createEntry(ListKey key)
public Collection<AccountGroup> load(String name)
throws Exception {
final ReviewDb db = schema.open();
try {
final List<AccountGroupName> groupNames =
db.accountGroupNames().all().toList();
final SortedSet<AccountGroup.NameKey> groups =
new TreeSet<AccountGroup.NameKey>();
for (final AccountGroupName groupName : groupNames) {
groups.add(groupName.getNameKey());
}
return Collections.unmodifiableSortedSet(groups);
return ImmutableList.copyOf(db.accountGroups()
.byExternalName(new AccountGroup.ExternalNameKey(name))
.toList());
} finally {
db.close();
}


@@ -14,12 +14,14 @@
package com.google.gerrit.server.account;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.LoadingCache;
import com.google.common.collect.ImmutableSet;
import com.google.common.collect.Sets;
import com.google.gerrit.reviewdb.client.AccountGroup;
import com.google.gerrit.reviewdb.client.AccountGroupInclude;
import com.google.gerrit.reviewdb.server.ReviewDb;
import com.google.gerrit.server.cache.Cache;
import com.google.gerrit.server.cache.CacheModule;
import com.google.gerrit.server.cache.EntryCreator;
import com.google.gwtorm.server.SchemaFactory;
import com.google.inject.Inject;
import com.google.inject.Module;
@@ -27,24 +29,30 @@ import com.google.inject.Singleton;
import com.google.inject.TypeLiteral;
import com.google.inject.name.Named;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.util.Collection;
import java.util.Collections;
import java.util.HashSet;
import java.util.List;
import java.util.Set;
import java.util.concurrent.ExecutionException;
/** Tracks group inclusions in memory for efficient access. */
@Singleton
public class GroupIncludeCacheImpl implements GroupIncludeCache {
private static final Logger log = LoggerFactory
.getLogger(GroupIncludeCacheImpl.class);
private static final String BYINCLUDE_NAME = "groups_byinclude";
public static Module module() {
return new CacheModule() {
@Override
protected void configure() {
final TypeLiteral<Cache<AccountGroup.UUID, Collection<AccountGroup.UUID>>> byInclude =
new TypeLiteral<Cache<AccountGroup.UUID, Collection<AccountGroup.UUID>>>() {};
core(byInclude, BYINCLUDE_NAME).populateWith(ByIncludeLoader.class);
cache(BYINCLUDE_NAME,
AccountGroup.UUID.class,
new TypeLiteral<Set<AccountGroup.UUID>>() {})
.loader(ByIncludeLoader.class);
bind(GroupIncludeCacheImpl.class);
bind(GroupIncludeCache.class).to(GroupIncludeCacheImpl.class);
@@ -52,24 +60,31 @@ public class GroupIncludeCacheImpl implements GroupIncludeCache {
};
}
private final Cache<AccountGroup.UUID, Collection<AccountGroup.UUID>> byInclude;
private final LoadingCache<AccountGroup.UUID, Set<AccountGroup.UUID>> byInclude;
@Inject
GroupIncludeCacheImpl(
@Named(BYINCLUDE_NAME) Cache<AccountGroup.UUID, Collection<AccountGroup.UUID>> byInclude) {
@Named(BYINCLUDE_NAME) LoadingCache<AccountGroup.UUID, Set<AccountGroup.UUID>> byInclude) {
this.byInclude = byInclude;
}
public Collection<AccountGroup.UUID> getByInclude(AccountGroup.UUID groupId) {
return byInclude.get(groupId);
try {
return byInclude.get(groupId);
} catch (ExecutionException e) {
log.warn("Cannot load included groups", e);
return Collections.emptySet();
}
}
public void evictInclude(AccountGroup.UUID groupId) {
byInclude.remove(groupId);
if (groupId != null) {
byInclude.invalidate(groupId);
}
}
static class ByIncludeLoader extends
EntryCreator<AccountGroup.UUID, Collection<AccountGroup.UUID>> {
CacheLoader<AccountGroup.UUID, Set<AccountGroup.UUID>> {
private final SchemaFactory<ReviewDb> schema;
@Inject
@@ -78,32 +93,28 @@ public class GroupIncludeCacheImpl implements GroupIncludeCache {
}
@Override
public Collection<AccountGroup.UUID> createEntry(final AccountGroup.UUID key) throws Exception {
public Set<AccountGroup.UUID> load(AccountGroup.UUID key) throws Exception {
final ReviewDb db = schema.open();
try {
List<AccountGroup> group = db.accountGroups().byUUID(key).toList();
if (group.size() != 1) {
return Collections.emptyList();
return Collections.emptySet();
}
Set<AccountGroup.Id> ids = new HashSet<AccountGroup.Id>();
for (AccountGroupInclude agi : db.accountGroupIncludes().byInclude(group.get(0).getId())) {
Set<AccountGroup.Id> ids = Sets.newHashSet();
for (AccountGroupInclude agi : db.accountGroupIncludes()
.byInclude(group.get(0).getId())) {
ids.add(agi.getGroupId());
}
Set<AccountGroup.UUID> groupArray = new HashSet<AccountGroup.UUID> ();
Set<AccountGroup.UUID> groupArray = Sets.newHashSet();
for (AccountGroup g : db.accountGroups().get(ids)) {
groupArray.add(g.getGroupUUID());
}
return Collections.unmodifiableCollection(groupArray);
return ImmutableSet.copyOf(groupArray);
} finally {
db.close();
}
}
@Override
public Collection<AccountGroup.UUID> missing(final AccountGroup.UUID key) {
return Collections.emptyList();
}
}
}
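The pattern this file converts to — a Guava LoadingCache built immediately at declaration, with the loader running on cache misses and ExecutionException handled at the call site — can be sketched standalone. This is an illustrative sketch, not part of the commit; the class and key names are invented, and the loader body stands in for the real ReviewDb lookup:

```java
import com.google.common.cache.CacheBuilder;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.LoadingCache;

import java.util.Collections;
import java.util.Set;
import java.util.concurrent.ExecutionException;

public class IncludeCacheSketch {
  // Built at declaration time, so the cache is usable the moment the
  // enclosing object is constructed -- no LifecycleListener.start() needed.
  private final LoadingCache<String, Set<String>> byInclude =
      CacheBuilder.newBuilder()
          .maximumSize(1024)
          .build(new CacheLoader<String, Set<String>>() {
            @Override
            public Set<String> load(String groupId) {
              // Stand-in for the database lookup done by ByIncludeLoader.
              return Collections.singleton("parent-of-" + groupId);
            }
          });

  public Set<String> getByInclude(String groupId) {
    try {
      return byInclude.get(groupId);
    } catch (ExecutionException e) {
      // Loader failures surface here; degrade to an empty result.
      return Collections.emptySet();
    }
  }

  public void evictInclude(String groupId) {
    if (groupId != null) {
      byInclude.invalidate(groupId);
    }
  }
}
```

Note the contrast with the old EntryCreator API: Guava's `get` throws a checked ExecutionException instead of silently returning a `missing()` value, so callers choose their own fallback.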


@@ -16,10 +16,10 @@ package com.google.gerrit.server.auth.ldap;
import static java.util.concurrent.TimeUnit.HOURS;
import com.google.common.base.Optional;
import com.google.gerrit.reviewdb.client.Account;
import com.google.gerrit.reviewdb.client.AccountGroup;
import com.google.gerrit.server.account.Realm;
import com.google.gerrit.server.cache.Cache;
import com.google.gerrit.server.cache.CacheModule;
import com.google.inject.Scopes;
import com.google.inject.TypeLiteral;
@@ -32,15 +32,16 @@ public class LdapModule extends CacheModule {
@Override
protected void configure() {
final TypeLiteral<Cache<String, Set<AccountGroup.UUID>>> groups =
new TypeLiteral<Cache<String, Set<AccountGroup.UUID>>>() {};
core(groups, GROUP_CACHE).maxAge(1, HOURS) //
.populateWith(LdapRealm.MemberLoader.class);
cache(GROUP_CACHE,
String.class,
new TypeLiteral<Set<AccountGroup.UUID>>() {})
.expireAfterWrite(1, HOURS)
.loader(LdapRealm.MemberLoader.class);
final TypeLiteral<Cache<String, Account.Id>> usernames =
new TypeLiteral<Cache<String, Account.Id>>() {};
core(usernames, USERNAME_CACHE) //
.populateWith(LdapRealm.UserLoader.class);
cache(USERNAME_CACHE,
String.class,
new TypeLiteral<Optional<Account.Id>>() {})
.loader(LdapRealm.UserLoader.class);
bind(Realm.class).to(LdapRealm.class).in(Scopes.SINGLETON);
bind(Helper.class);


@@ -16,6 +16,9 @@ package com.google.gerrit.server.auth.ldap;
import static com.google.gerrit.reviewdb.client.AccountExternalId.SCHEME_GERRIT;
import com.google.common.base.Optional;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.LoadingCache;
import com.google.common.collect.Iterables;
import com.google.gerrit.common.data.ParameterizedString;
import com.google.gerrit.reviewdb.client.Account;
@@ -32,12 +35,9 @@ import com.google.gerrit.server.account.MaterializedGroupMembership;
import com.google.gerrit.server.account.Realm;
import com.google.gerrit.server.auth.AuthenticationUnavailableException;
import com.google.gerrit.server.auth.ldap.Helper.LdapSchema;
import com.google.gerrit.server.cache.Cache;
import com.google.gerrit.server.cache.EntryCreator;
import com.google.gerrit.server.config.AuthConfig;
import com.google.gerrit.server.config.ConfigUtil;
import com.google.gerrit.server.config.GerritServerConfig;
import com.google.gwtorm.server.OrmException;
import com.google.gwtorm.server.SchemaFactory;
import com.google.inject.Inject;
import com.google.inject.Singleton;
@@ -56,6 +56,7 @@ import java.util.List;
import java.util.Locale;
import java.util.Map;
import java.util.Set;
import java.util.concurrent.ExecutionException;
import javax.naming.NamingException;
import javax.naming.directory.DirContext;
@@ -70,11 +71,11 @@ class LdapRealm implements Realm {
private final Helper helper;
private final AuthConfig authConfig;
private final EmailExpander emailExpander;
private final Cache<String, Account.Id> usernameCache;
private final LoadingCache<String, Optional<Account.Id>> usernameCache;
private final Set<Account.FieldName> readOnlyAccountFields;
private final Config config;
private final Cache<String, Set<AccountGroup.UUID>> membershipCache;
private final LoadingCache<String, Set<AccountGroup.UUID>> membershipCache;
private final MaterializedGroupMembership.Factory groupMembershipFactory;
@Inject
@@ -82,8 +83,8 @@ class LdapRealm implements Realm {
final Helper helper,
final AuthConfig authConfig,
final EmailExpander emailExpander,
@Named(LdapModule.GROUP_CACHE) final Cache<String, Set<AccountGroup.UUID>> membershipCache,
@Named(LdapModule.USERNAME_CACHE) final Cache<String, Account.Id> usernameCache,
@Named(LdapModule.GROUP_CACHE) final LoadingCache<String, Set<AccountGroup.UUID>> membershipCache,
@Named(LdapModule.USERNAME_CACHE) final LoadingCache<String, Optional<Account.Id>> usernameCache,
@GerritServerConfig final Config config,
final MaterializedGroupMembership.Factory groupMembershipFactory) {
this.helper = helper;
@@ -261,13 +262,21 @@ class LdapRealm implements Realm {
@Override
public void onCreateAccount(final AuthRequest who, final Account account) {
usernameCache.put(who.getLocalUser(), account.getId());
usernameCache.put(who.getLocalUser(), Optional.of(account.getId()));
}
@Override
public GroupMembership groups(final AccountState who) {
String id = findId(who.getExternalIds());
Set<AccountGroup.UUID> groups;
try {
groups = membershipCache.get(id);
} catch (ExecutionException e) {
log.warn(String.format("Cannot lookup groups for %s in LDAP", id), e);
groups = Collections.emptySet();
}
return groupMembershipFactory.create(Iterables.concat(
membershipCache.get(findId(who.getExternalIds())),
groups,
who.getInternalGroups()));
}
@@ -281,8 +290,14 @@
}
@Override
public Account.Id lookup(final String accountName) {
return usernameCache.get(accountName);
public Account.Id lookup(String accountName) {
try {
Optional<Account.Id> id = usernameCache.get(accountName);
return id != null ? id.orNull() : null;
} catch (ExecutionException e) {
log.warn(String.format("Cannot lookup account %s in LDAP", accountName), e);
return null;
}
}
@Override
@@ -319,7 +334,7 @@
return out;
}
static class UserLoader extends EntryCreator<String, Account.Id> {
static class UserLoader extends CacheLoader<String, Optional<Account.Id>> {
private final SchemaFactory<ReviewDb> schema;
@Inject
@@ -328,25 +343,23 @@
}
@Override
public Account.Id createEntry(final String username) throws Exception {
public Optional<Account.Id> load(String username) throws Exception {
final ReviewDb db = schema.open();
try {
final ReviewDb db = schema.open();
try {
final AccountExternalId extId =
db.accountExternalIds().get(
new AccountExternalId.Key(SCHEME_GERRIT, username));
return extId != null ? extId.getAccountId() : null;
} finally {
db.close();
final AccountExternalId extId =
db.accountExternalIds().get(
new AccountExternalId.Key(SCHEME_GERRIT, username));
if (extId != null) {
return Optional.of(extId.getAccountId());
}
} catch (OrmException e) {
log.warn("Cannot query for username in database", e);
return null;
return Optional.absent();
} finally {
db.close();
}
}
}
static class MemberLoader extends EntryCreator<String, Set<AccountGroup.UUID>> {
static class MemberLoader extends CacheLoader<String, Set<AccountGroup.UUID>> {
private final Helper helper;
@Inject
@@ -355,8 +368,7 @@ class LdapRealm implements Realm {
}
@Override
public Set<AccountGroup.UUID> createEntry(final String username)
throws Exception {
public Set<AccountGroup.UUID> load(String username) throws Exception {
final DirContext ctx = helper.open();
try {
return helper.queryForGroups(ctx, username, null);
@@ -368,10 +380,5 @@
}
}
}
@Override
public Set<AccountGroup.UUID> missing(final String key) {
return Collections.emptySet();
}
}
}


@@ -1,35 +0,0 @@
// Copyright (C) 2009 The Android Open Source Project
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package com.google.gerrit.server.cache;
/**
* A fast in-memory and/or on-disk based cache.
*
* @type <K> type of key used to lookup entries in the cache.
* @type <V> type of value stored within each cache entry.
*/
public interface Cache<K, V> {
/** Get the element from the cache, or null if not stored in the cache. */
public V get(K key);
/** Put one element into the cache, replacing any existing value. */
public void put(K key, V value);
/** Remove any existing value from the cache, no-op if not present. */
public void remove(K key);
/** Remove all cached items. */
public void removeAll();
}


@@ -0,0 +1,46 @@
// Copyright (C) 2009 The Android Open Source Project
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package com.google.gerrit.server.cache;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.Weigher;
import com.google.inject.TypeLiteral;
import java.util.concurrent.TimeUnit;
import javax.annotation.Nullable;
/** Configure a cache declared within a {@link CacheModule} instance. */
public interface CacheBinding<K, V> {
/** Set the total size of the cache. */
CacheBinding<K, V> maximumWeight(long weight);
/** Set the time an element lives before being expired. */
CacheBinding<K, V> expireAfterWrite(long duration, TimeUnit durationUnits);
/** Populate the cache with items from the CacheLoader. */
CacheBinding<K, V> loader(Class<? extends CacheLoader<K, V>> clazz);
/** Algorithm to weigh an object with a method other than the unit weight 1. */
CacheBinding<K, V> weigher(Class<? extends Weigher<K, V>> clazz);
String name();
TypeLiteral<K> keyType();
TypeLiteral<V> valueType();
long maximumWeight();
@Nullable Long expireAfterWrite(TimeUnit unit);
@Nullable Weigher<K, V> weigher();
@Nullable CacheLoader<K, V> loader();
}
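The knobs the new CacheBinding interface exposes — maximumWeight, expireAfterWrite, and a Weigher — map directly onto Guava's CacheBuilder. A rough sketch of how such settings translate (the method and class names here are illustrative, not the commit's actual factory code):

```java
import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;
import com.google.common.cache.Weigher;

import java.util.concurrent.TimeUnit;

public class BuilderSketch {
  // Weigh entries by value length instead of the unit weight of 1,
  // mirroring what CacheBinding.weigher(...) allows a cache to declare.
  static Cache<String, String> build(long maxWeight, long expireSecs) {
    return CacheBuilder.newBuilder()
        .maximumWeight(maxWeight)
        .weigher(new Weigher<String, String>() {
          @Override
          public int weigh(String key, String value) {
            return value.length();
          }
        })
        .expireAfterWrite(expireSecs, TimeUnit.SECONDS)
        .build();
  }
}
```

With a custom weigher, eviction tracks the total weight of stored entries rather than a raw entry count, which is why the DSL talks about weight instead of the old memoryLimit object count.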


@@ -14,33 +14,41 @@
package com.google.gerrit.server.cache;
import com.google.common.cache.Cache;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.LoadingCache;
import com.google.common.cache.Weigher;
import com.google.gerrit.extensions.annotations.Exports;
import com.google.inject.AbstractModule;
import com.google.inject.Key;
import com.google.inject.Provider;
import com.google.inject.Scopes;
import com.google.inject.TypeLiteral;
import com.google.inject.internal.UniqueAnnotations;
import com.google.inject.name.Names;
import com.google.inject.util.Types;
import java.io.Serializable;
import java.lang.reflect.Type;
/**
* Miniature DSL to support binding {@link Cache} instances in Guice.
*/
public abstract class CacheModule extends AbstractModule {
private static final TypeLiteral<Cache<?, ?>> ANY_CACHE =
new TypeLiteral<Cache<?, ?>>() {};
/**
* Declare an unnamed in-memory cache.
* Declare a named in-memory cache.
*
* @param <K> type of key used to lookup entries.
* @param <V> type of value stored by the cache.
* @param type type literal for the cache, this literal will be used to match
* injection sites.
* @return binding to describe the cache. Caller must set at least the name on
* the returned binding.
* @return binding to describe the cache.
*/
protected <K, V> UnnamedCacheBinding<K, V> core(
final TypeLiteral<Cache<K, V>> type) {
return core(Key.get(type));
protected <K, V> CacheBinding<K, V> cache(
String name,
Class<K> keyType,
Class<V> valType) {
return cache(name, TypeLiteral.get(keyType), TypeLiteral.get(valType));
}
/**
@@ -48,74 +56,127 @@ public abstract class CacheModule extends AbstractModule {
*
* @param <K> type of key used to lookup entries.
* @param <V> type of value stored by the cache.
* @param type type literal for the cache, this literal will be used to match
* injection sites. Injection sites are matched by this type literal
* and with {@code @Named} annotations.
* @return binding to describe the cache.
*/
protected <K, V> NamedCacheBinding<K, V> core(
final TypeLiteral<Cache<K, V>> type, final String name) {
return core(Key.get(type, Names.named(name))).name(name);
}
private <K, V> UnnamedCacheBinding<K, V> core(final Key<Cache<K, V>> key) {
final boolean disk = false;
final CacheProvider<K, V> b = new CacheProvider<K, V>(disk, this);
bind(key).toProvider(b).in(Scopes.SINGLETON);
return b;
protected <K, V> CacheBinding<K, V> cache(
String name,
Class<K> keyType,
TypeLiteral<V> valType) {
return cache(name, TypeLiteral.get(keyType), valType);
}
/**
* Declare an unnamed in-memory/on-disk cache.
* Declare a named in-memory cache.
*
* @param <K> type of key used to find entries, must be {@link Serializable}.
* @param <V> type of value stored by the cache, must be {@link Serializable}.
* @param type type literal for the cache, this literal will be used to match
* injection sites. Injection sites are matched by this type literal
* and with {@code @Named} annotations.
* @return binding to describe the cache. Caller must set at least the name on
* the returned binding.
* @param <K> type of key used to lookup entries.
* @param <V> type of value stored by the cache.
* @return binding to describe the cache.
*/
protected <K extends Serializable, V extends Serializable> UnnamedCacheBinding<K, V> disk(
final TypeLiteral<Cache<K, V>> type) {
return disk(Key.get(type));
protected <K, V> CacheBinding<K, V> cache(
String name,
TypeLiteral<K> keyType,
TypeLiteral<V> valType) {
Type type = Types.newParameterizedType(
Cache.class,
keyType.getType(), valType.getType());
@SuppressWarnings("unchecked")
Key<Cache<K, V>> key = (Key<Cache<K, V>>) Key.get(type, Names.named(name));
CacheProvider<K, V> m =
new CacheProvider<K, V>(this, name, keyType, valType);
bind(key).toProvider(m).in(Scopes.SINGLETON);
bind(ANY_CACHE).annotatedWith(Exports.named(name)).to(key);
return m.maximumWeight(1024);
}
<K,V> Provider<CacheLoader<K,V>> bindCacheLoader(
CacheProvider<K, V> m,
Class<? extends CacheLoader<K,V>> impl) {
Type type = Types.newParameterizedType(
Cache.class,
m.keyType().getType(), m.valueType().getType());
Type loadingType = Types.newParameterizedType(
LoadingCache.class,
m.keyType().getType(), m.valueType().getType());
Type loaderType = Types.newParameterizedType(
CacheLoader.class,
m.keyType().getType(), m.valueType().getType());
@SuppressWarnings("unchecked")
Key<LoadingCache<K, V>> key =
(Key<LoadingCache<K, V>>) Key.get(type, Names.named(m.name));
@SuppressWarnings("unchecked")
Key<LoadingCache<K, V>> loadingKey =
(Key<LoadingCache<K, V>>) Key.get(loadingType, Names.named(m.name));
@SuppressWarnings("unchecked")
Key<CacheLoader<K, V>> loaderKey =
(Key<CacheLoader<K, V>>) Key.get(loaderType, Names.named(m.name));
bind(loaderKey).to(impl).in(Scopes.SINGLETON);
bind(loadingKey).to(key);
return getProvider(loaderKey);
}
<K,V> Provider<Weigher<K,V>> bindWeigher(
CacheProvider<K, V> m,
Class<? extends Weigher<K,V>> impl) {
Type weigherType = Types.newParameterizedType(
Weigher.class,
m.keyType().getType(), m.valueType().getType());
@SuppressWarnings("unchecked")
Key<Weigher<K, V>> key =
(Key<Weigher<K, V>>) Key.get(weigherType, Names.named(m.name));
bind(key).to(impl).in(Scopes.SINGLETON);
return getProvider(key);
}
/**
* Declare a named in-memory/on-disk cache.
*
* @param <K> type of key used to find entries, must be {@link Serializable}.
* @param <V> type of value stored by the cache, must be {@link Serializable}.
* @param type type literal for the cache, this literal will be used to match
* injection sites. Injection sites are matched by this type literal
* and with {@code @Named} annotations.
* @param <K> type of key used to lookup entries.
* @param <V> type of value stored by the cache.
* @return binding to describe the cache.
*/
protected <K extends Serializable, V extends Serializable> NamedCacheBinding<K, V> disk(
final TypeLiteral<Cache<K, V>> type, final String name) {
return disk(Key.get(type, Names.named(name))).name(name);
protected <K extends Serializable, V extends Serializable> CacheBinding<K, V> persist(
String name,
Class<K> keyType,
Class<V> valType) {
return persist(name, TypeLiteral.get(keyType), TypeLiteral.get(valType));
}
private <K, V> UnnamedCacheBinding<K, V> disk(final Key<Cache<K, V>> key) {
final boolean disk = true;
final CacheProvider<K, V> b = new CacheProvider<K, V>(disk, this);
bind(key).toProvider(b).in(Scopes.SINGLETON);
return b;
/**
* Declare a named in-memory/on-disk cache.
*
* @param <K> type of key used to lookup entries.
* @param <V> type of value stored by the cache.
* @return binding to describe the cache.
*/
protected <K extends Serializable, V extends Serializable> CacheBinding<K, V> persist(
String name,
Class<K> keyType,
TypeLiteral<V> valType) {
return persist(name, TypeLiteral.get(keyType), valType);
}
<K, V> Provider<EntryCreator<K, V>> getEntryCreator(CacheProvider<K, V> cp,
Class<? extends EntryCreator<K, V>> type) {
Key<EntryCreator<K, V>> key = newKey();
bind(key).to(type).in(Scopes.SINGLETON);
return getProvider(key);
}
@SuppressWarnings("unchecked")
private static <K, V> Key<EntryCreator<K, V>> newKey() {
return (Key<EntryCreator<K, V>>) newKeyImpl();
}
private static Key<?> newKeyImpl() {
return Key.get(EntryCreator.class, UniqueAnnotations.create());
/**
* Declare a named in-memory/on-disk cache.
*
* @param <K> type of key used to lookup entries.
* @param <V> type of value stored by the cache.
* @return binding to describe the cache.
*/
protected <K extends Serializable, V extends Serializable> CacheBinding<K, V> persist(
String name,
TypeLiteral<K> keyType,
TypeLiteral<V> valType) {
return ((CacheProvider<K, V>) cache(name, keyType, valType))
.persist(true);
}
}


@@ -1,4 +1,4 @@
// Copyright (C) 2009 The Android Open Source Project
// Copyright (C) 2012 The Android Open Source Project
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
@@ -14,130 +14,156 @@
package com.google.gerrit.server.cache;
import static com.google.gerrit.server.cache.EvictionPolicy.LFU;
import static java.util.concurrent.TimeUnit.DAYS;
import static java.util.concurrent.TimeUnit.SECONDS;
import com.google.common.base.Preconditions;
import com.google.common.base.Strings;
import com.google.common.cache.Cache;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.Weigher;
import com.google.gerrit.extensions.annotations.PluginName;
import com.google.inject.Inject;
import com.google.inject.Provider;
import com.google.inject.ProvisionException;
import com.google.inject.TypeLiteral;
import java.util.concurrent.TimeUnit;
public final class CacheProvider<K, V> implements Provider<Cache<K, V>>,
NamedCacheBinding<K, V>, UnnamedCacheBinding<K, V> {
import javax.annotation.Nullable;
class CacheProvider<K, V>
implements Provider<Cache<K, V>>,
CacheBinding<K, V> {
private final CacheModule module;
private final boolean disk;
private int memoryLimit;
private int diskLimit;
private long maxAge;
private EvictionPolicy evictionPolicy;
private String cacheName;
private ProxyCache<K, V> cache;
private Provider<EntryCreator<K, V>> entryCreator;
final String name;
private final TypeLiteral<K> keyType;
private final TypeLiteral<V> valType;
private boolean persist;
private long maximumWeight;
private Long expireAfterWrite;
private Provider<CacheLoader<K, V>> loader;
private Provider<Weigher<K, V>> weigher;
CacheProvider(final boolean disk, CacheModule module) {
this.disk = disk;
private String plugin;
private MemoryCacheFactory memoryCacheFactory;
private PersistentCacheFactory persistentCacheFactory;
private boolean frozen;
CacheProvider(CacheModule module,
String name,
TypeLiteral<K> keyType,
TypeLiteral<V> valType) {
this.module = module;
this.name = name;
this.keyType = keyType;
this.valType = valType;
}
memoryLimit(1024);
maxAge(90, DAYS);
evictionPolicy(LFU);
if (disk) {
diskLimit(16384);
}
@Inject(optional = true)
void setPluginName(@PluginName String pluginName) {
this.plugin = pluginName;
}
@Inject
void setCachePool(final CachePool pool) {
this.cache = pool.register(this);
void setMemoryCacheFactory(MemoryCacheFactory factory) {
this.memoryCacheFactory = factory;
}
public void bind(Cache<K, V> impl) {
if (cache == null) {
throw new ProvisionException("Cache was never registered");
}
cache.bind(impl);
@Inject(optional = true)
void setPersistentCacheFactory(@Nullable PersistentCacheFactory factory) {
this.persistentCacheFactory = factory;
}
public EntryCreator<K, V> getEntryCreator() {
return entryCreator != null ? entryCreator.get() : null;
}
public String getName() {
if (cacheName == null) {
throw new ProvisionException("Cache has no name");
}
return cacheName;
}
public boolean disk() {
return disk;
}
public int memoryLimit() {
return memoryLimit;
}
public int diskLimit() {
return diskLimit;
}
public long maxAge() {
return maxAge;
}
public EvictionPolicy evictionPolicy() {
return evictionPolicy;
}
public NamedCacheBinding<K, V> name(final String name) {
if (cacheName != null) {
throw new IllegalStateException("Cache name already set");
}
cacheName = name;
return this;
}
public NamedCacheBinding<K, V> memoryLimit(final int objects) {
memoryLimit = objects;
return this;
}
public NamedCacheBinding<K, V> diskLimit(final int objects) {
if (!disk) {
// TODO This should really be a compile time type error, but I'm
// too lazy to create the mess of permutations required to setup
// type safe returns for bindings in our little DSL.
//
throw new IllegalStateException("Cache is not disk based");
}
diskLimit = objects;
return this;
}
public NamedCacheBinding<K, V> maxAge(final long duration, final TimeUnit unit) {
maxAge = SECONDS.convert(duration, unit);
CacheBinding<K, V> persist(boolean p) {
Preconditions.checkState(!frozen, "binding frozen, cannot be modified");
persist = p;
return this;
}
@Override
public NamedCacheBinding<K, V> evictionPolicy(final EvictionPolicy policy) {
evictionPolicy = policy;
public CacheBinding<K, V> maximumWeight(long weight) {
Preconditions.checkState(!frozen, "binding frozen, cannot be modified");
maximumWeight = weight;
return this;
}
public NamedCacheBinding<K, V> populateWith(
Class<? extends EntryCreator<K, V>> creator) {
entryCreator = module.getEntryCreator(this, creator);
@Override
public CacheBinding<K, V> expireAfterWrite(long duration, TimeUnit unit) {
Preconditions.checkState(!frozen, "binding frozen, cannot be modified");
expireAfterWrite = SECONDS.convert(duration, unit);
return this;
}
public Cache<K, V> get() {
if (cache == null) {
throw new ProvisionException("Cache \"" + cacheName + "\" not available");
@Override
public CacheBinding<K, V> loader(Class<? extends CacheLoader<K, V>> impl) {
Preconditions.checkState(!frozen, "binding frozen, cannot be modified");
loader = module.bindCacheLoader(this, impl);
return this;
}
@Override
public CacheBinding<K, V> weigher(Class<? extends Weigher<K, V>> impl) {
Preconditions.checkState(!frozen, "binding frozen, cannot be modified");
weigher = module.bindWeigher(this, impl);
return this;
}
@Override
public String name() {
if (!Strings.isNullOrEmpty(plugin)) {
return plugin + "." + name;
}
return name;
}
@Override
public TypeLiteral<K> keyType() {
return keyType;
}
@Override
public TypeLiteral<V> valueType() {
return valType;
}
@Override
public long maximumWeight() {
return maximumWeight;
}
@Override
@Nullable
public Long expireAfterWrite(TimeUnit unit) {
return expireAfterWrite != null
? unit.convert(expireAfterWrite, SECONDS)
: null;
}
@Override
@Nullable
public Weigher<K, V> weigher() {
return weigher != null ? weigher.get() : null;
}
@Override
@Nullable
public CacheLoader<K, V> loader() {
return loader != null ? loader.get() : null;
}
@Override
public Cache<K, V> get() {
frozen = true;
if (loader != null) {
CacheLoader<K, V> ldr = loader.get();
if (persist && persistentCacheFactory != null) {
return persistentCacheFactory.build(this, ldr);
}
return memoryCacheFactory.build(this, ldr);
} else if (persist && persistentCacheFactory != null) {
return persistentCacheFactory.build(this);
} else {
return memoryCacheFactory.build(this);
}
return cache;
}
}
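When get() above selects the persistent factory, reads from the backing H2 database are gated by a Bloom filter so that most true misses never touch the database. The idea can be sketched with Guava's BloomFilter; the in-memory map below is only a stand-in for the real H2 table, and all names are illustrative:

```java
import com.google.common.hash.BloomFilter;
import com.google.common.hash.Funnels;

import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;

public class BloomGateSketch {
  private final Map<String, String> disk = new HashMap<>(); // stand-in for H2
  // 3% false-positive rate: roughly 97% of true misses skip the disk read.
  private final BloomFilter<String> keys = BloomFilter.create(
      Funnels.stringFunnel(StandardCharsets.UTF_8), 10000, 0.03);

  public void put(String key, String value) {
    // In the real cache the store happens on a background thread
    // shortly after the in-memory put.
    disk.put(key, value);
    keys.put(key);
  }

  public String getIfPresent(String key) {
    // mightContain is false only for keys that were never stored,
    // so a negative answer safely short-circuits the database call.
    if (!keys.mightContain(key)) {
      return null;
    }
    return disk.get(key);
  }
}
```

A Bloom filter never yields false negatives, so the gate can only skip lookups that are guaranteed to miss; a false positive merely costs one extra database read.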


@@ -1,48 +0,0 @@
// Copyright (C) 2011 The Android Open Source Project
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.package com.google.gerrit.server.git;
package com.google.gerrit.server.cache;
import java.util.concurrent.ConcurrentHashMap;
/**
* An infinitely sized cache backed by java.util.ConcurrentHashMap.
* <p>
* This cache type is only suitable for unit tests, as it has no upper limit on
* number of items held in the cache. No upper limit can result in memory leaks
* in production servers.
*/
public class ConcurrentHashMapCache<K, V> implements Cache<K, V> {
private final ConcurrentHashMap<K, V> map = new ConcurrentHashMap<K, V>();
@Override
public V get(K key) {
return map.get(key);
}
@Override
public void put(K key, V value) {
map.put(key, value);
}
@Override
public void remove(K key) {
map.remove(key);
}
@Override
public void removeAll() {
map.clear();
}
}


@@ -1,40 +0,0 @@
// Copyright (C) 2009 The Android Open Source Project
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package com.google.gerrit.server.cache;
/**
* Creates a cache entry on demand when its not found.
*
* @param <K> type of the cache's key.
* @param <V> type of the cache's value element.
*/
public abstract class EntryCreator<K, V> {
/**
* Invoked on a cache miss, to compute the cache entry.
*
* @param key entry whose content needs to be obtained.
* @return new cache content. The caller will automatically put this object
* into the cache.
* @throws Exception the cache content cannot be computed. No entry will be
* stored in the cache, and {@link #missing(Object)} will be invoked
* instead. Future requests for the same key will retry this method.
*/
public abstract V createEntry(K key) throws Exception;
/** Invoked when {@link #createEntry(Object)} fails, by default return null. */
public V missing(K key) {
return null;
}
}

View File

@ -1,4 +1,4 @@
// Copyright (C) 2009 The Android Open Source Project
// Copyright (C) 2012 The Android Open Source Project
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
@ -14,11 +14,14 @@
package com.google.gerrit.server.cache;
/** How entries should be evicted from the cache. */
public enum EvictionPolicy {
/** Least recently used is evicted first. */
LRU,
import com.google.common.cache.Cache;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.LoadingCache;
/** Least frequently used is evicted first. */
LFU;
public interface MemoryCacheFactory {
<K, V> Cache<K, V> build(CacheBinding<K, V> def);
<K, V> LoadingCache<K, V> build(
CacheBinding<K, V> def,
CacheLoader<K, V> loader);
}
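The two `build` overloads mirror Guava's `Cache` vs `LoadingCache` split: the second attaches a `CacheLoader`, so `get()` computes a missing entry through the loader instead of returning null. A JDK-only sketch of that loading behavior (illustrative names, not the Gerrit implementation):

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import java.util.function.Function;

/** Minimal stand-in for LoadingCache: get() loads on a miss, getIfPresent() does not. */
class LoadingCacheSketch<K, V> {
  private final ConcurrentMap<K, V> map = new ConcurrentHashMap<>();
  private final Function<K, V> loader;

  LoadingCacheSketch(Function<K, V> loader) {
    this.loader = loader;
  }

  V get(K key) {
    // computeIfAbsent runs the loader at most once per missing key.
    return map.computeIfAbsent(key, loader);
  }

  V getIfPresent(K key) { return map.get(key); }
  void invalidate(K key) { map.remove(key); }
}
```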

View File

@ -1,35 +0,0 @@
// Copyright (C) 2009 The Android Open Source Project
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package com.google.gerrit.server.cache;
import java.util.concurrent.TimeUnit;
/** Configure a cache declared within a {@link CacheModule} instance. */
public interface NamedCacheBinding<K, V> {
/** Set the number of objects to cache in memory. */
public NamedCacheBinding<K, V> memoryLimit(int objects);
/** Set the number of objects to cache in memory. */
public NamedCacheBinding<K, V> diskLimit(int objects);
/** Set the time an element lives before being expired. */
public NamedCacheBinding<K, V> maxAge(long duration, TimeUnit durationUnits);
/** Set the eviction policy for elements when the cache is full. */
public NamedCacheBinding<K, V> evictionPolicy(EvictionPolicy policy);
/** Populate the cache with items from the EntryCreator. */
public NamedCacheBinding<K, V> populateWith(Class<? extends EntryCreator<K, V>> creator);
}

View File

@ -1,4 +1,4 @@
// Copyright (C) 2010 The Android Open Source Project
// Copyright (C) 2012 The Android Open Source Project
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
@ -14,6 +14,14 @@
package com.google.gerrit.server.cache;
public interface CachePool {
public <K, V> ProxyCache<K, V> register(CacheProvider<K, V> provider);
import com.google.common.cache.Cache;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.LoadingCache;
public interface PersistentCacheFactory {
<K, V> Cache<K, V> build(CacheBinding<K, V> def);
<K, V> LoadingCache<K, V> build(
CacheBinding<K, V> def,
CacheLoader<K, V> loader);
}
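Per the commit message, reads through this persistent factory are gated by a BloomFilter so that most cache misses never reach H2. The gating idea can be sketched with a JDK `BitSet` (a toy filter with illustrative hash mixing, not Guava's `BloomFilter` implementation):

```java
import java.util.BitSet;

/** Tiny Bloom filter sketch: gate expensive disk reads on a cache miss. */
class BloomGate {
  private final BitSet bits;
  private final int size;

  BloomGate(int size) {
    this.size = size;
    this.bits = new BitSet(size);
  }

  private int[] positions(Object key) {
    int h1 = key.hashCode();
    int h2 = Integer.reverse(h1) ^ 0x9e3779b9; // second, derived hash
    return new int[] {
        Math.floorMod(h1, size),
        Math.floorMod(h1 + h2, size),
        Math.floorMod(h1 + 2 * h2, size)};
  }

  void add(Object key) {
    for (int p : positions(key)) {
      bits.set(p);
    }
  }

  /** False means definitely not stored; true means "maybe, go ask the database". */
  boolean mightContain(Object key) {
    for (int p : positions(key)) {
      if (!bits.get(p)) {
        return false;
      }
    }
    return true;
  }
}
```

A Bloom filter never reports a false negative, which is why a "no" answer can safely skip the H2 read; the commit message's "less than 3% of cache misses reach H2" is the false-positive budget.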

View File

@ -1,40 +0,0 @@
// Copyright (C) 2010 The Android Open Source Project
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package com.google.gerrit.server.cache;
/** Proxy around a cache which has not yet been created. */
public final class ProxyCache<K, V> implements Cache<K, V> {
private volatile Cache<K, V> self;
public void bind(Cache<K, V> self) {
this.self = self;
}
public V get(K key) {
return self.get(key);
}
public void put(K key, V value) {
self.put(key, value);
}
public void remove(K key) {
self.remove(key);
}
public void removeAll() {
self.removeAll();
}
}

View File

@ -16,9 +16,11 @@ package com.google.gerrit.server.config;
import static com.google.inject.Scopes.SINGLETON;
import com.google.common.cache.Cache;
import com.google.gerrit.common.data.ApprovalTypes;
import com.google.gerrit.extensions.events.GitReferenceUpdatedListener;
import com.google.gerrit.extensions.events.NewProjectCreatedListener;
import com.google.gerrit.extensions.registration.DynamicMap;
import com.google.gerrit.extensions.registration.DynamicSet;
import com.google.gerrit.reviewdb.client.AuthType;
import com.google.gerrit.rules.PrologModule;
@ -68,7 +70,7 @@ import com.google.gerrit.server.util.IdGenerator;
import com.google.gerrit.server.util.ThreadLocalRequestContext;
import com.google.gerrit.server.workflow.FunctionState;
import com.google.inject.Inject;
import com.google.inject.servlet.RequestScoped;
import com.google.inject.TypeLiteral;
import org.apache.velocity.runtime.RuntimeInstance;
import org.eclipse.jgit.lib.Config;
@ -156,6 +158,7 @@ public class GerritGlobalModule extends FactoryModule {
factory(FunctionState.Factory.class);
bind(GitReferenceUpdated.class);
DynamicMap.mapOf(binder(), new TypeLiteral<Cache<?, ?>>() {});
DynamicSet.setOf(binder(), GitReferenceUpdatedListener.class);
DynamicSet.setOf(binder(), NewProjectCreatedListener.class);

View File

@ -32,6 +32,7 @@ import com.google.gerrit.server.config.CanonicalWebUrl;
import com.google.gerrit.server.patch.PatchList;
import com.google.gerrit.server.patch.PatchListCache;
import com.google.gerrit.server.patch.PatchListEntry;
import com.google.gerrit.server.patch.PatchListNotAvailableException;
import com.google.gwtorm.server.OrmException;
import com.google.gwtorm.server.SchemaFactory;
import com.google.inject.Inject;
@ -232,16 +233,19 @@ public class EventFactory {
public void addPatchSetFileNames(PatchSetAttribute patchSetAttribute,
Change change, PatchSet patchSet) {
PatchList patchList = patchListCache.get(change, patchSet);
for (PatchListEntry patch : patchList.getPatches()) {
if (patchSetAttribute.files == null) {
patchSetAttribute.files = new ArrayList<PatchAttribute>();
}
try {
PatchList patchList = patchListCache.get(change, patchSet);
for (PatchListEntry patch : patchList.getPatches()) {
if (patchSetAttribute.files == null) {
patchSetAttribute.files = new ArrayList<PatchAttribute>();
}
PatchAttribute p = new PatchAttribute();
p.file = patch.getNewName();
p.type = patch.getChangeType();
patchSetAttribute.files.add(p);
PatchAttribute p = new PatchAttribute();
p.file = patch.getNewName();
p.type = patch.getChangeType();
patchSetAttribute.files.add(p);
}
} catch (PatchListNotAvailableException e) {
}
}

View File

@ -17,10 +17,8 @@ package com.google.gerrit.server.git;
import static com.google.gerrit.server.git.GitRepositoryManager.REF_REJECT_COMMITS;
import com.google.gerrit.common.errors.PermissionDeniedException;
import com.google.gerrit.reviewdb.client.Account;
import com.google.gerrit.server.CurrentUser;
import com.google.gerrit.server.GerritPersonIdent;
import com.google.gerrit.server.account.AccountCache;
import com.google.gerrit.server.IdentifiedUser;
import com.google.gerrit.server.project.ProjectControl;
import com.google.inject.Inject;
import com.google.inject.Provider;
@ -44,7 +42,9 @@ import org.eclipse.jgit.revwalk.RevCommit;
import org.eclipse.jgit.revwalk.RevWalk;
import java.io.IOException;
import java.util.Date;
import java.util.List;
import java.util.TimeZone;
public class BanCommit {
@ -55,25 +55,23 @@ public class BanCommit {
BanCommit create();
}
private final Provider<CurrentUser> currentUser;
private final Provider<IdentifiedUser> currentUser;
private final GitRepositoryManager repoManager;
private final AccountCache accountCache;
private final PersonIdent gerritIdent;
@Inject
BanCommit(final Provider<CurrentUser> currentUser,
final GitRepositoryManager repoManager, final AccountCache accountCache,
BanCommit(final Provider<IdentifiedUser> currentUser,
final GitRepositoryManager repoManager,
@GerritPersonIdent final PersonIdent gerritIdent) {
this.currentUser = currentUser;
this.repoManager = repoManager;
this.accountCache = accountCache;
this.gerritIdent = gerritIdent;
}
public BanCommitResult ban(final ProjectControl projectControl,
final List<ObjectId> commitsToBan, final String reason)
throws PermissionDeniedException, IOException,
IncompleteUserInfoException, InterruptedException, MergeException {
InterruptedException, MergeException {
if (!projectControl.isOwner()) {
throw new PermissionDeniedException(
"No project owner: not permitted to ban commits");
@ -148,16 +146,10 @@ public class BanCommit {
return result;
}
private PersonIdent createPersonIdent() throws IncompleteUserInfoException {
final String userName = currentUser.get().getUserName();
final Account account = accountCache.getByUsername(userName).getAccount();
if (account.getFullName() == null) {
throw new IncompleteUserInfoException(userName, "full name");
}
if (account.getPreferredEmail() == null) {
throw new IncompleteUserInfoException(userName, "preferred email");
}
return new PersonIdent(account.getFullName(), account.getPreferredEmail());
private PersonIdent createPersonIdent() {
Date now = new Date();
TimeZone tz = gerritIdent.getTimeZone();
return currentUser.get().newCommitterIdent(now, tz);
}
private static ObjectId commit(final NoteMap noteMap,

View File

@ -14,13 +14,12 @@
package com.google.gerrit.server.git;
import com.google.common.cache.Cache;
import com.google.gerrit.reviewdb.client.Project;
import com.google.gerrit.server.cache.Cache;
import com.google.gerrit.server.cache.CacheModule;
import com.google.inject.Inject;
import com.google.inject.Module;
import com.google.inject.Singleton;
import com.google.inject.TypeLiteral;
import com.google.inject.name.Named;
import org.eclipse.jgit.lib.ObjectId;
@ -38,19 +37,17 @@ public class TagCache {
return new CacheModule() {
@Override
protected void configure() {
final TypeLiteral<Cache<EntryKey, EntryVal>> type =
new TypeLiteral<Cache<EntryKey, EntryVal>>() {};
disk(type, CACHE_NAME);
persist(CACHE_NAME, String.class, EntryVal.class);
bind(TagCache.class);
}
};
}
private final Cache<EntryKey, EntryVal> cache;
private final Cache<String, EntryVal> cache;
private final Object createLock = new Object();
@Inject
TagCache(@Named(CACHE_NAME) Cache<EntryKey, EntryVal> cache) {
TagCache(@Named(CACHE_NAME) Cache<String, EntryVal> cache) {
this.cache = cache;
}
@ -74,7 +71,7 @@ public class TagCache {
// never fail with an exception. Some of these references can be null
// (e.g. not all projects are cached, or the cache is not current).
//
EntryVal val = cache.get(new EntryKey(name));
EntryVal val = cache.getIfPresent(name.get());
if (val != null) {
TagSetHolder holder = val.holder;
if (holder != null) {
@ -87,54 +84,22 @@ public class TagCache {
}
TagSetHolder get(Project.NameKey name) {
EntryKey key = new EntryKey(name);
EntryVal val = cache.get(key);
EntryVal val = cache.getIfPresent(name.get());
if (val == null) {
synchronized (createLock) {
val = cache.get(key);
val = cache.getIfPresent(name.get());
if (val == null) {
val = new EntryVal();
val.holder = new TagSetHolder(name);
cache.put(key, val);
cache.put(name.get(), val);
}
}
}
return val.holder;
}
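The `get` path above uses a check/lock/re-check sequence so only one `TagSetHolder` is ever created per project name. With a `ConcurrentMap`, `computeIfAbsent` gives the same at-most-once guarantee in a single call (a sketch of the pattern, not the TagCache code):

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

/** One value per key, created at most once, without an explicit create lock. */
class HolderMap {
  private final ConcurrentMap<String, Object> cache = new ConcurrentHashMap<>();

  Object getOrCreate(String name) {
    return cache.computeIfAbsent(name, n -> new Object());
  }
}
```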
static class EntryKey implements Serializable {
static final long serialVersionUID = 1L;
private transient String name;
EntryKey(Project.NameKey name) {
this.name = name.get();
}
@Override
public int hashCode() {
return name.hashCode();
}
@Override
public boolean equals(Object o) {
if (o instanceof EntryKey) {
return name.equals(((EntryKey) o).name);
}
return false;
}
private void readObject(ObjectInputStream in) throws IOException {
name = in.readUTF();
}
private void writeObject(ObjectOutputStream out) throws IOException {
out.writeUTF(name);
}
}
static class EntryVal implements Serializable {
static final long serialVersionUID = EntryKey.serialVersionUID;
static final long serialVersionUID = 1L;
transient TagSetHolder holder;

View File

@ -58,6 +58,15 @@ class TagSet {
return tags.get(id);
}
int weigh() {
int refCnt = refs.size();
int bits = refCnt / 8;
int size = 16 + 3*8 + 16 + 16;
size += (16 + 16 + 8 + 4 + 36 + 120) * refCnt;
size += (16 + 36 + 16 + bits) * tags.size();
return size;
}
void updateFastForward(String refName, ObjectId oldValue,
ObjectId newValue) {
CachedRef ref = refs.get(refName);

View File

@ -35,6 +35,7 @@ import com.google.gerrit.server.IdentifiedUser;
import com.google.gerrit.server.git.NotifyConfig;
import com.google.gerrit.server.patch.PatchList;
import com.google.gerrit.server.patch.PatchListEntry;
import com.google.gerrit.server.patch.PatchListNotAvailableException;
import com.google.gerrit.server.patch.PatchSetInfoNotAvailableException;
import com.google.gerrit.server.project.ProjectState;
import com.google.gerrit.server.query.Predicate;
@ -270,13 +271,12 @@ public abstract class ChangeEmail extends OutgoingEmail {
}
}
/** Get the patch list corresponding to this patch set. */
protected PatchList getPatchList() {
protected PatchList getPatchList() throws PatchListNotAvailableException {
if (patchSet != null) {
return args.patchListCache.get(change, patchSet);
}
return null;
throw new PatchListNotAvailableException("no patchSet specified");
}
/** Get the project entity the change is in; null if it's been deleted. */

View File

@ -21,6 +21,7 @@ import com.google.gerrit.reviewdb.client.AccountProjectWatch.NotifyType;
import com.google.gerrit.server.config.AnonymousCowardName;
import com.google.gerrit.server.patch.PatchFile;
import com.google.gerrit.server.patch.PatchList;
import com.google.gerrit.server.patch.PatchListNotAvailableException;
import com.google.inject.Inject;
import com.google.inject.assistedinject.Assisted;
@ -81,7 +82,14 @@ public class CommentSender extends ReplyToChangeSender {
final Repository repo = getRepository();
try {
final PatchList patchList = repo != null ? getPatchList() : null;
PatchList patchList = null;
if (repo != null) {
try {
patchList = getPatchList();
} catch (PatchListNotAvailableException e) {
patchList = null;
}
}
Patch.Key currentFileKey = null;
PatchFile currentFileData = null;

View File

@ -114,6 +114,9 @@ public class IntraLineDiffKey implements Serializable {
public String toString() {
StringBuilder n = new StringBuilder();
n.append("IntraLineDiffKey[");
if (projectKey != null) {
n.append(projectKey.get()).append(" ");
}
n.append(aId.name());
n.append("..");
n.append(bId.name());

View File

@ -15,7 +15,7 @@
package com.google.gerrit.server.patch;
import com.google.gerrit.server.cache.EntryCreator;
import com.google.common.cache.CacheLoader;
import com.google.gerrit.server.config.ConfigUtil;
import com.google.gerrit.server.config.GerritServerConfig;
import com.google.inject.Inject;
@ -35,9 +35,8 @@ import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.regex.Pattern;
class IntraLineLoader extends EntryCreator<IntraLineDiffKey, IntraLineDiff> {
private static final Logger log = LoggerFactory
.getLogger(IntraLineLoader.class);
class IntraLineLoader extends CacheLoader<IntraLineDiffKey, IntraLineDiff> {
static final Logger log = LoggerFactory.getLogger(IntraLineLoader.class);
private static final Pattern BLANK_LINE_RE = Pattern
.compile("^[ \\t]*(|[{}]|/\\*\\*?|\\*)[ \\t]*$");
@ -62,7 +61,7 @@ class IntraLineLoader extends EntryCreator<IntraLineDiffKey, IntraLineDiff> {
}
@Override
public IntraLineDiff createEntry(IntraLineDiffKey key) throws Exception {
public IntraLineDiff load(IntraLineDiffKey key) throws Exception {
Worker w = workerPool.poll();
if (w == null) {
w = new Worker();
@ -119,7 +118,7 @@ class IntraLineLoader extends EntryCreator<IntraLineDiffKey, IntraLineDiff> {
throws Exception {
if (!input.offer(new Input(key))) {
log.error("Cannot enqueue task to thread " + thread.getName());
return null;
return Result.TIMEOUT;
}
Result r = result.poll(timeoutMillis, TimeUnit.MILLISECONDS);

View File

@ -0,0 +1,28 @@
// Copyright (C) 2012 The Android Open Source Project
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package com.google.gerrit.server.patch;
import com.google.common.cache.Weigher;
/** Approximates memory usage for IntralineDiff in bytes of memory used. */
public class IntraLineWeigher implements
Weigher<IntraLineDiffKey, IntraLineDiff> {
@Override
public int weigh(IntraLineDiffKey key, IntraLineDiff value) {
return 16 + 8*8 + 2*36 // Size of IntraLineDiffKey, 64 bit JVM
+ 16 + 2*8 + 16+8+4+20 // Size of IntraLineDiff, 64 bit JVM
+ (8 + 16 + 4*4) * value.getEdits().size();
}
}
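A `Weigher` like this one lets `maximumWeight` bound the cache by approximate bytes rather than by entry count. The eviction semantics can be sketched with the JDK: weigh each entry on insert and drop the least recently used entries until the total fits under the cap (Guava's actual policy is similar in spirit, not identical):

```java
import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.ToIntFunction;

/** JDK-only sketch of weight-bounded LRU eviction. */
class WeightBoundedCache<K, V> {
  // accessOrder=true: iteration runs from least to most recently used.
  private final LinkedHashMap<K, V> map = new LinkedHashMap<>(16, 0.75f, true);
  private final ToIntFunction<V> weigher;
  private final long maximumWeight;
  private long totalWeight;

  WeightBoundedCache(long maximumWeight, ToIntFunction<V> weigher) {
    this.maximumWeight = maximumWeight;
    this.weigher = weigher;
  }

  synchronized void put(K key, V value) {
    V old = map.put(key, value);
    if (old != null) {
      totalWeight -= weigher.applyAsInt(old);
    }
    totalWeight += weigher.applyAsInt(value);
    Iterator<Map.Entry<K, V>> it = map.entrySet().iterator();
    while (totalWeight > maximumWeight && it.hasNext()) {
      Map.Entry<K, V> eldest = it.next();
      totalWeight -= weigher.applyAsInt(eldest.getValue());
      it.remove();
    }
  }

  synchronized V getIfPresent(K key) { return map.get(key); }
  synchronized long weight() { return totalWeight; }
}
```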

View File

@ -19,9 +19,10 @@ import com.google.gerrit.reviewdb.client.PatchSet;
/** Provides a cached list of {@link PatchListEntry}. */
public interface PatchListCache {
public PatchList get(PatchListKey key);
public PatchList get(PatchListKey key) throws PatchListNotAvailableException;
public PatchList get(Change change, PatchSet patchSet);
public PatchList get(Change change, PatchSet patchSet)
throws PatchListNotAvailableException;
public IntraLineDiff getIntraLineDiff(IntraLineDiffKey key);
}

View File

@ -15,24 +15,23 @@
package com.google.gerrit.server.patch;
import com.google.common.cache.LoadingCache;
import com.google.gerrit.reviewdb.client.AccountDiffPreference.Whitespace;
import com.google.gerrit.reviewdb.client.Change;
import com.google.gerrit.reviewdb.client.PatchSet;
import com.google.gerrit.reviewdb.client.Project;
import com.google.gerrit.reviewdb.client.AccountDiffPreference.Whitespace;
import com.google.gerrit.server.cache.Cache;
import com.google.gerrit.server.cache.CacheModule;
import com.google.gerrit.server.cache.EvictionPolicy;
import com.google.gerrit.server.config.GerritServerConfig;
import com.google.inject.Inject;
import com.google.inject.Module;
import com.google.inject.Singleton;
import com.google.inject.TypeLiteral;
import com.google.inject.name.Named;
import org.eclipse.jgit.lib.Config;
import org.eclipse.jgit.lib.ObjectId;
import java.util.concurrent.ExecutionException;
/** Provides a cached list of {@link PatchListEntry}. */
@Singleton
public class PatchListCacheImpl implements PatchListCache {
@ -43,21 +42,15 @@ public class PatchListCacheImpl implements PatchListCache {
return new CacheModule() {
@Override
protected void configure() {
final TypeLiteral<Cache<PatchListKey, PatchList>> fileType =
new TypeLiteral<Cache<PatchListKey, PatchList>>() {};
disk(fileType, FILE_NAME) //
.memoryLimit(128) // very large items, cache only a few
.evictionPolicy(EvictionPolicy.LRU) // prefer most recent
.populateWith(PatchListLoader.class) //
;
persist(FILE_NAME, PatchListKey.class, PatchList.class)
.maximumWeight(10 << 20)
.loader(PatchListLoader.class)
.weigher(PatchListWeigher.class);
final TypeLiteral<Cache<IntraLineDiffKey, IntraLineDiff>> intraType =
new TypeLiteral<Cache<IntraLineDiffKey, IntraLineDiff>>() {};
disk(intraType, INTRA_NAME) //
.memoryLimit(128) // very large items, cache only a few
.evictionPolicy(EvictionPolicy.LRU) // prefer most recent
.populateWith(IntraLineLoader.class) //
;
persist(INTRA_NAME, IntraLineDiffKey.class, IntraLineDiff.class)
.maximumWeight(10 << 20)
.loader(IntraLineLoader.class)
.weigher(IntraLineWeigher.class);
bind(PatchListCacheImpl.class);
bind(PatchListCache.class).to(PatchListCacheImpl.class);
@ -65,14 +58,14 @@ public class PatchListCacheImpl implements PatchListCache {
};
}
private final Cache<PatchListKey, PatchList> fileCache;
private final Cache<IntraLineDiffKey, IntraLineDiff> intraCache;
private final LoadingCache<PatchListKey, PatchList> fileCache;
private final LoadingCache<IntraLineDiffKey, IntraLineDiff> intraCache;
private final boolean computeIntraline;
@Inject
PatchListCacheImpl(
@Named(FILE_NAME) final Cache<PatchListKey, PatchList> fileCache,
@Named(INTRA_NAME) final Cache<IntraLineDiffKey, IntraLineDiff> intraCache,
@Named(FILE_NAME) LoadingCache<PatchListKey, PatchList> fileCache,
@Named(INTRA_NAME) LoadingCache<IntraLineDiffKey, IntraLineDiff> intraCache,
@GerritServerConfig Config cfg) {
this.fileCache = fileCache;
this.intraCache = intraCache;
@ -82,11 +75,19 @@ public class PatchListCacheImpl implements PatchListCache {
cfg.getBoolean("cache", "diff", "intraline", true));
}
public PatchList get(final PatchListKey key) {
return fileCache.get(key);
@Override
public PatchList get(PatchListKey key) throws PatchListNotAvailableException {
try {
return fileCache.get(key);
} catch (ExecutionException e) {
PatchListLoader.log.warn("Error computing " + key, e);
throw new PatchListNotAvailableException(e.getCause());
}
}
public PatchList get(final Change change, final PatchSet patchSet) {
@Override
public PatchList get(final Change change, final PatchSet patchSet)
throws PatchListNotAvailableException {
final Project.NameKey projectKey = change.getProject();
final ObjectId a = null;
final ObjectId b = ObjectId.fromString(patchSet.getRevision().get());
@ -97,11 +98,12 @@ public class PatchListCacheImpl implements PatchListCache {
@Override
public IntraLineDiff getIntraLineDiff(IntraLineDiffKey key) {
if (computeIntraline) {
IntraLineDiff d = intraCache.get(key);
if (d == null) {
d = new IntraLineDiff(IntraLineDiff.Status.ERROR);
try {
return intraCache.get(key);
} catch (ExecutionException e) {
IntraLineLoader.log.warn("Error computing " + key, e);
return new IntraLineDiff(IntraLineDiff.Status.ERROR);
}
return d;
} else {
return new IntraLineDiff(IntraLineDiff.Status.DISABLED);
}
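Both `get` methods above follow the same Guava idiom: `LoadingCache.get` wraps any loader failure in an `ExecutionException`, and the caller either rethrows the cause as a domain exception (`PatchListNotAvailableException`) or substitutes a sentinel value (`Status.ERROR`). The unwrap step in isolation, using a `FutureTask` as a JDK stand-in for the loading call (class names here are illustrative):

```java
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.FutureTask;

/** Illustrative domain exception, mirroring PatchListNotAvailableException. */
class NotAvailableException extends Exception {
  NotAvailableException(Throwable cause) { super(cause); }
}

class Unwrap {
  static String compute(Callable<String> loader) throws NotAvailableException {
    FutureTask<String> task = new FutureTask<>(loader);
    task.run(); // stand-in for LoadingCache.get() running its CacheLoader
    try {
      return task.get();
    } catch (ExecutionException e) {
      // The interesting failure is the cause, not the wrapper.
      throw new NotAvailableException(e.getCause());
    } catch (InterruptedException e) {
      Thread.currentThread().interrupt();
      throw new NotAvailableException(e);
    }
  }
}
```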

View File

@ -122,6 +122,22 @@ public class PatchListEntry {
this.deletions = deletions;
}
int weigh() {
int size = 16 + 6*8 + 2*4 + 20 + 16+8+4+20;
size += stringSize(oldName);
size += stringSize(newName);
size += header.length;
size += (8 + 16 + 4*4) * edits.size();
return size;
}
private static int stringSize(String str) {
if (str != null) {
return 16 + 3*4 + 16 + str.length() * 2;
}
return 0;
}
public ChangeType getChangeType() {
return changeType;
}

View File

@ -15,9 +15,9 @@
package com.google.gerrit.server.patch;
import com.google.gerrit.reviewdb.client.Patch;
import com.google.common.cache.CacheLoader;
import com.google.gerrit.reviewdb.client.AccountDiffPreference.Whitespace;
import com.google.gerrit.server.cache.EntryCreator;
import com.google.gerrit.reviewdb.client.Patch;
import com.google.gerrit.server.git.GitRepositoryManager;
import com.google.inject.Inject;
@ -54,6 +54,8 @@ import org.eclipse.jgit.treewalk.TreeWalk;
import org.eclipse.jgit.treewalk.filter.TreeFilter;
import org.eclipse.jgit.util.TemporaryBuffer;
import org.eclipse.jgit.util.io.DisabledOutputStream;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.io.IOException;
import java.io.InputStream;
@ -62,7 +64,9 @@ import java.util.HashMap;
import java.util.List;
import java.util.Map;
class PatchListLoader extends EntryCreator<PatchListKey, PatchList> {
class PatchListLoader extends CacheLoader<PatchListKey, PatchList> {
static final Logger log = LoggerFactory.getLogger(PatchListLoader.class);
private final GitRepositoryManager repoManager;
@Inject
@ -71,7 +75,7 @@ class PatchListLoader extends EntryCreator<PatchListKey, PatchList> {
}
@Override
public PatchList createEntry(final PatchListKey key) throws Exception {
public PatchList load(final PatchListKey key) throws Exception {
final Repository repo = repoManager.openRepository(key.projectKey);
try {
return readPatchList(key, repo);

View File

@ -12,12 +12,16 @@
// See the License for the specific language governing permissions and
// limitations under the License.
package com.google.gerrit.server.git;
package com.google.gerrit.server.patch;
public class IncompleteUserInfoException extends Exception {
public class PatchListNotAvailableException extends Exception {
private static final long serialVersionUID = 1L;
public IncompleteUserInfoException(final String userName, final String missingInfo) {
super("For the user \"" + userName + "\" " + missingInfo + " is not set.");
public PatchListNotAvailableException(String message) {
super(message);
}
public PatchListNotAvailableException(Throwable cause) {
super(cause);
}
}

View File

@ -0,0 +1,30 @@
// Copyright (C) 2012 The Android Open Source Project
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package com.google.gerrit.server.patch;
import com.google.common.cache.Weigher;
/** Approximates memory usage for PatchList in bytes of memory used. */
public class PatchListWeigher implements Weigher<PatchListKey, PatchList> {
@Override
public int weigh(PatchListKey key, PatchList value) {
int size = 16 + 4*8 + 2*36 // Size of PatchListKey, 64 bit JVM
+ 16 + 3*8 + 3*4 + 20; // Size of PatchList, 64 bit JVM
for (PatchListEntry e : value.getPatches()) {
size += e.weigh();
}
return size;
}
}

View File

@ -14,10 +14,11 @@
package com.google.gerrit.server.project;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.LoadingCache;
import com.google.common.collect.Sets;
import com.google.gerrit.reviewdb.client.Project;
import com.google.gerrit.server.cache.Cache;
import com.google.gerrit.server.cache.CacheModule;
import com.google.gerrit.server.cache.EntryCreator;
import com.google.gerrit.server.config.AllProjectsName;
import com.google.gerrit.server.git.GitRepositoryManager;
import com.google.gerrit.server.git.ProjectConfig;
@ -27,20 +28,24 @@ import com.google.inject.Singleton;
import com.google.inject.TypeLiteral;
import com.google.inject.name.Named;
import org.eclipse.jgit.errors.RepositoryNotFoundException;
import org.eclipse.jgit.lib.Repository;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.util.Collections;
import java.util.Iterator;
import java.util.NoSuchElementException;
import java.util.SortedSet;
import java.util.TreeSet;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.locks.Lock;
import java.util.concurrent.locks.ReentrantLock;
/** Cache of project information, including access rights. */
@Singleton
public class ProjectCacheImpl implements ProjectCache {
private static final Logger log = LoggerFactory
.getLogger(ProjectCacheImpl.class);
private static final String CACHE_NAME = "projects";
private static final String CACHE_LIST = "project_list";
@ -48,13 +53,14 @@ public class ProjectCacheImpl implements ProjectCache {
return new CacheModule() {
@Override
protected void configure() {
final TypeLiteral<Cache<Project.NameKey, ProjectState>> nameType =
new TypeLiteral<Cache<Project.NameKey, ProjectState>>() {};
core(nameType, CACHE_NAME).populateWith(Loader.class);
cache(CACHE_NAME, String.class, ProjectState.class)
.loader(Loader.class);
final TypeLiteral<Cache<ListKey, SortedSet<Project.NameKey>>> listType =
new TypeLiteral<Cache<ListKey, SortedSet<Project.NameKey>>>() {};
core(listType, CACHE_LIST).populateWith(Lister.class);
cache(CACHE_LIST,
ListKey.class,
new TypeLiteral<SortedSet<Project.NameKey>>() {})
.maximumWeight(1)
.loader(Lister.class);
bind(ProjectCacheImpl.class);
bind(ProjectCache.class).to(ProjectCacheImpl.class);
@ -63,16 +69,16 @@ public class ProjectCacheImpl implements ProjectCache {
}
private final AllProjectsName allProjectsName;
private final Cache<Project.NameKey, ProjectState> byName;
private final Cache<ListKey,SortedSet<Project.NameKey>> list;
private final LoadingCache<String, ProjectState> byName;
private final LoadingCache<ListKey, SortedSet<Project.NameKey>> list;
private final Lock listLock;
private final ProjectCacheClock clock;
@Inject
ProjectCacheImpl(
final AllProjectsName allProjectsName,
@Named(CACHE_NAME) final Cache<Project.NameKey, ProjectState> byName,
@Named(CACHE_LIST) final Cache<ListKey, SortedSet<Project.NameKey>> list,
@Named(CACHE_NAME) LoadingCache<String, ProjectState> byName,
@Named(CACHE_LIST) LoadingCache<ListKey, SortedSet<Project.NameKey>> list,
ProjectCacheClock clock) {
this.allProjectsName = allProjectsName;
this.byName = byName;
@ -99,18 +105,26 @@ public class ProjectCacheImpl implements ProjectCache {
* @return the cached data; null if no such project exists.
*/
public ProjectState get(final Project.NameKey projectName) {
ProjectState state = byName.get(projectName);
if (state != null && state.needsRefresh(clock.read())) {
byName.remove(projectName);
state = byName.get(projectName);
if (projectName == null) {
return null;
}
try {
ProjectState state = byName.get(projectName.get());
if (state != null && state.needsRefresh(clock.read())) {
byName.invalidate(projectName.get());
state = byName.get(projectName.get());
}
return state;
} catch (ExecutionException e) {
log.warn(String.format("Cannot read project %s", projectName.get()), e);
return null;
}
return state;
}
/** Invalidate the cached information about the given project. */
public void evict(final Project p) {
if (p != null) {
byName.remove(p.getNameKey());
byName.invalidate(p.getNameKey().get());
}
}
@ -118,10 +132,11 @@ public class ProjectCacheImpl implements ProjectCache {
public void remove(final Project p) {
listLock.lock();
try {
SortedSet<Project.NameKey> n = list.get(ListKey.ALL);
n = new TreeSet<Project.NameKey>(n);
SortedSet<Project.NameKey> n = Sets.newTreeSet(list.get(ListKey.ALL));
n.remove(p.getNameKey());
list.put(ListKey.ALL, Collections.unmodifiableSortedSet(n));
} catch (ExecutionException e) {
log.warn("Cannot list available projects", e);
} finally {
listLock.unlock();
}
@ -132,10 +147,11 @@ public class ProjectCacheImpl implements ProjectCache {
public void onCreateProject(Project.NameKey newProjectName) {
listLock.lock();
try {
SortedSet<Project.NameKey> n = list.get(ListKey.ALL);
n = new TreeSet<Project.NameKey>(n);
SortedSet<Project.NameKey> n = Sets.newTreeSet(list.get(ListKey.ALL));
n.add(newProjectName);
list.put(ListKey.ALL, Collections.unmodifiableSortedSet(n));
} catch (ExecutionException e) {
log.warn("Cannot list available projects", e);
} finally {
listLock.unlock();
}
@ -143,18 +159,28 @@ public class ProjectCacheImpl implements ProjectCache {
@Override
public Iterable<Project.NameKey> all() {
return list.get(ListKey.ALL);
try {
return list.get(ListKey.ALL);
} catch (ExecutionException e) {
log.warn("Cannot list available projects", e);
return Collections.emptyList();
}
}
@Override
public Iterable<Project.NameKey> byName(final String pfx) {
final Iterable<Project.NameKey> src;
try {
src = list.get(ListKey.ALL).tailSet(new Project.NameKey(pfx));
} catch (ExecutionException e) {
return Collections.emptyList();
}
return new Iterable<Project.NameKey>() {
@Override
public Iterator<Project.NameKey> iterator() {
return new Iterator<Project.NameKey>() {
private Iterator<Project.NameKey> itr = src.iterator();
private Project.NameKey next;
private Iterator<Project.NameKey> itr =
list.get(ListKey.ALL).tailSet(new Project.NameKey(pfx)).iterator();
@Override
public boolean hasNext() {
@ -196,7 +222,7 @@ public class ProjectCacheImpl implements ProjectCache {
};
}
static class Loader extends EntryCreator<Project.NameKey, ProjectState> {
static class Loader extends CacheLoader<String, ProjectState> {
private final ProjectState.Factory projectStateFactory;
private final GitRepositoryManager mgr;
@@ -207,19 +233,15 @@ public class ProjectCacheImpl implements ProjectCache {
}
@Override
public ProjectState createEntry(Project.NameKey key) throws Exception {
public ProjectState load(String projectName) throws Exception {
Project.NameKey key = new Project.NameKey(projectName);
Repository git = mgr.openRepository(key);
try {
Repository git = mgr.openRepository(key);
try {
final ProjectConfig cfg = new ProjectConfig(key);
cfg.load(git);
return projectStateFactory.create(cfg);
} finally {
git.close();
}
} catch (RepositoryNotFoundException notFound) {
return null;
ProjectConfig cfg = new ProjectConfig(key);
cfg.load(git);
return projectStateFactory.create(cfg);
} finally {
git.close();
}
}
}
@@ -231,7 +253,7 @@ public class ProjectCacheImpl implements ProjectCache {
}
}
static class Lister extends EntryCreator<ListKey, SortedSet<Project.NameKey>> {
static class Lister extends CacheLoader<ListKey, SortedSet<Project.NameKey>> {
private final GitRepositoryManager mgr;
@Inject
@@ -240,7 +262,7 @@ public class ProjectCacheImpl implements ProjectCache {
}
@Override
public SortedSet<Project.NameKey> createEntry(ListKey key) throws Exception {
public SortedSet<Project.NameKey> load(ListKey key) throws Exception {
return mgr.list();
}
}
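The EntryCreator-to-CacheLoader migration above follows Guava's loading-cache contract: load() runs on a miss and its result is cached, and failures surface as ExecutionException at the get() call site. A minimal stdlib-only sketch of that pattern (hypothetical names, with ConcurrentHashMap.computeIfAbsent standing in for Guava's LoadingCache):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// Hypothetical sketch of the loading-cache pattern: the loader computes the
// value on first access, and subsequent gets return the cached result.
class LoadingCacheSketch<K, V> {
  private final Map<K, V> store = new ConcurrentHashMap<>();
  private final Function<K, V> loader;

  LoadingCacheSketch(Function<K, V> loader) {
    this.loader = loader;
  }

  // Analog of LoadingCache.get(key): load on miss, then cache.
  V get(K key) {
    return store.computeIfAbsent(key, loader);
  }

  // Analog of Cache.invalidate(key), as used by the evict() methods.
  void invalidate(K key) {
    store.remove(key);
  }
}
```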


@@ -14,14 +14,13 @@
package com.google.gerrit.server.project;
import com.google.common.cache.Cache;
import com.google.gerrit.common.data.AccessSection;
import com.google.gerrit.server.cache.Cache;
import com.google.gerrit.server.cache.CacheModule;
import com.google.gerrit.server.util.MostSpecificComparator;
import com.google.inject.Inject;
import com.google.inject.Module;
import com.google.inject.Singleton;
import com.google.inject.TypeLiteral;
import com.google.inject.name.Named;
import java.util.Arrays;
@@ -38,9 +37,7 @@ public class SectionSortCache {
return new CacheModule() {
@Override
protected void configure() {
final TypeLiteral<Cache<EntryKey, EntryVal>> type =
new TypeLiteral<Cache<EntryKey, EntryVal>>() {};
core(type, CACHE_NAME);
cache(CACHE_NAME, EntryKey.class, EntryVal.class);
bind(SectionSortCache.class);
}
};
@@ -60,7 +57,7 @@ public class SectionSortCache {
}
EntryKey key = new EntryKey(ref, sections);
EntryVal val = cache.get(key);
EntryVal val = cache.getIfPresent(key);
if (val != null) {
int[] srcIdx = val.order;
if (srcIdx != null) {
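SectionSortCache is a non-loading cache, so the lookup switches to getIfPresent, which returns null on a miss instead of invoking a loader; the caller computes the value and stores it itself. A stdlib sketch of that manual-population pattern (hypothetical names):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical sketch of the non-loading pattern used by SectionSortCache:
// a plain map get() plays the role of cache.getIfPresent(key), and the
// caller does the work and put()s the result on a miss.
class ManualCacheSketch {
  private final Map<String, int[]> cache = new ConcurrentHashMap<>();

  int[] sort(String key) {
    int[] order = cache.get(key);    // analog of cache.getIfPresent(key)
    if (order == null) {
      order = computeOrder(key);     // caller computes the value on a miss
      cache.put(key, order);         // analog of cache.put(key, val)
    }
    return order;
  }

  private int[] computeOrder(String key) {
    return new int[] {key.length()}; // stand-in for the real sort order
  }
}
```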


@@ -31,6 +31,7 @@ import com.google.gerrit.server.git.GitRepositoryManager;
import com.google.gerrit.server.patch.PatchList;
import com.google.gerrit.server.patch.PatchListCache;
import com.google.gerrit.server.patch.PatchListEntry;
import com.google.gerrit.server.patch.PatchListNotAvailableException;
import com.google.gerrit.server.project.ChangeControl;
import com.google.gwtorm.server.OrmException;
import com.google.gwtorm.server.ResultSet;
@@ -142,7 +143,14 @@ public class ChangeData {
return null;
}
PatchList p = cache.get(c, ps);
PatchList p;
try {
p = cache.get(c, ps);
} catch (PatchListNotAvailableException e) {
currentFiles = new String[0];
return currentFiles;
}
List<String> r = new ArrayList<String>(p.getPatches().size());
for (PatchListEntry e : p.getPatches()) {
if (Patch.COMMIT_MSG.equals(e.getNewName())) {


@@ -20,6 +20,8 @@ import static com.google.gerrit.common.data.Permission.PUSH;
import static com.google.gerrit.common.data.Permission.READ;
import static com.google.gerrit.common.data.Permission.SUBMIT;
import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;
import com.google.common.collect.Lists;
import com.google.gerrit.common.data.Capable;
import com.google.gerrit.common.data.GroupReference;
@@ -36,7 +38,6 @@ import com.google.gerrit.server.CurrentUser;
import com.google.gerrit.server.account.CapabilityControl;
import com.google.gerrit.server.account.GroupMembership;
import com.google.gerrit.server.account.ListGroupMembership;
import com.google.gerrit.server.cache.ConcurrentHashMapCache;
import com.google.gerrit.server.config.AllProjectsName;
import com.google.gerrit.server.config.FactoryModule;
import com.google.gerrit.server.config.GerritServerConfig;
@@ -321,10 +322,9 @@ public class RefControlTest extends TestCase {
local.createInMemory();
local.getProject().setParentName(parent.getProject().getName());
sectionSorter =
new PermissionCollection.Factory(
new SectionSortCache(
new ConcurrentHashMapCache<SectionSortCache.EntryKey, SectionSortCache.EntryVal>()));
Cache<SectionSortCache.EntryKey, SectionSortCache.EntryVal> c =
CacheBuilder.newBuilder().build();
sectionSorter = new PermissionCollection.Factory(new SectionSortCache(c));
}
private static void assertOwner(String ref, ProjectControl u) {


@@ -67,7 +67,7 @@ limitations under the License.
<dependency>
<groupId>com.google.gerrit</groupId>
<artifactId>gerrit-ehcache</artifactId>
<artifactId>gerrit-cache-h2</artifactId>
<version>${project.version}</version>
</dependency>
</dependencies>


@@ -16,13 +16,13 @@ package com.google.gerrit.sshd;
import static com.google.gerrit.reviewdb.client.AccountExternalId.SCHEME_USERNAME;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.LoadingCache;
import com.google.gerrit.common.errors.InvalidSshKeyException;
import com.google.gerrit.reviewdb.client.AccountExternalId;
import com.google.gerrit.reviewdb.client.AccountSshKey;
import com.google.gerrit.reviewdb.server.ReviewDb;
import com.google.gerrit.server.cache.Cache;
import com.google.gerrit.server.cache.CacheModule;
import com.google.gerrit.server.cache.EntryCreator;
import com.google.gerrit.server.ssh.SshKeyCache;
import com.google.gwtorm.server.OrmException;
import com.google.gwtorm.server.SchemaFactory;
@@ -42,6 +42,7 @@ import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import java.util.concurrent.ExecutionException;
/** Provides the {@link SshKeyCacheEntry}. */
@Singleton
@@ -57,9 +58,10 @@ public class SshKeyCacheImpl implements SshKeyCache {
return new CacheModule() {
@Override
protected void configure() {
final TypeLiteral<Cache<String, Iterable<SshKeyCacheEntry>>> type =
new TypeLiteral<Cache<String, Iterable<SshKeyCacheEntry>>>() {};
core(type, CACHE_NAME).populateWith(Loader.class);
cache(CACHE_NAME,
String.class,
new TypeLiteral<Iterable<SshKeyCacheEntry>>(){})
.loader(Loader.class);
bind(SshKeyCacheImpl.class);
bind(SshKeyCache.class).to(SshKeyCacheImpl.class);
}
@@ -71,20 +73,27 @@ public class SshKeyCacheImpl implements SshKeyCache {
.asList(new SshKeyCacheEntry[0]));
}
private final Cache<String, Iterable<SshKeyCacheEntry>> cache;
private final LoadingCache<String, Iterable<SshKeyCacheEntry>> cache;
@Inject
SshKeyCacheImpl(
@Named(CACHE_NAME) final Cache<String, Iterable<SshKeyCacheEntry>> cache) {
@Named(CACHE_NAME) LoadingCache<String, Iterable<SshKeyCacheEntry>> cache) {
this.cache = cache;
}
public Iterable<SshKeyCacheEntry> get(String username) {
return cache.get(username);
Iterable<SshKeyCacheEntry> get(String username) {
try {
return cache.get(username);
} catch (ExecutionException e) {
log.warn("Cannot load SSH keys for " + username, e);
return Collections.emptyList();
}
}
public void evict(String username) {
cache.remove(username);
if (username != null) {
cache.invalidate(username);
}
}
@Override
@@ -107,7 +116,7 @@ public class SshKeyCacheImpl implements SshKeyCache {
}
}
static class Loader extends EntryCreator<String, Iterable<SshKeyCacheEntry>> {
static class Loader extends CacheLoader<String, Iterable<SshKeyCacheEntry>> {
private final SchemaFactory<ReviewDb> schema;
@Inject
@@ -116,8 +125,7 @@ public class SshKeyCacheImpl implements SshKeyCache {
}
@Override
public Iterable<SshKeyCacheEntry> createEntry(String username)
throws Exception {
public Iterable<SshKeyCacheEntry> load(String username) throws Exception {
final ReviewDb db = schema.open();
try {
final AccountExternalId.Key key =
@@ -143,11 +151,6 @@ public class SshKeyCacheImpl implements SshKeyCache {
}
}
@Override
public Iterable<SshKeyCacheEntry> missing(String username) {
return Collections.emptyList();
}
private void add(ReviewDb db, List<SshKeyCacheEntry> kl, AccountSshKey k) {
try {
kl.add(new SshKeyCacheEntry(k.getKey(), SshUtil.parse(k)));


@@ -17,7 +17,6 @@ package com.google.gerrit.sshd.commands;
import com.google.gerrit.common.errors.PermissionDeniedException;
import com.google.gerrit.server.git.BanCommit;
import com.google.gerrit.server.git.BanCommitResult;
import com.google.gerrit.server.git.IncompleteUserInfoException;
import com.google.gerrit.server.git.MergeException;
import com.google.gerrit.server.project.ProjectControl;
import com.google.gerrit.sshd.SshCommand;
@@ -77,8 +76,6 @@ public class BanCommitCommand extends SshCommand {
throw die(e);
} catch (IOException e) {
throw die(e);
} catch (IncompleteUserInfoException e) {
throw die(e);
} catch (MergeException e) {
throw die(e);
} catch (InterruptedException e) {


@@ -14,37 +14,33 @@
package com.google.gerrit.sshd.commands;
import com.google.gerrit.ehcache.EhcachePoolImpl;
import com.google.common.cache.Cache;
import com.google.common.collect.Sets;
import com.google.gerrit.extensions.registration.DynamicMap;
import com.google.gerrit.sshd.SshCommand;
import com.google.inject.Inject;
import net.sf.ehcache.CacheManager;
import net.sf.ehcache.Ehcache;
import java.util.Arrays;
import java.util.SortedSet;
import java.util.TreeSet;
abstract class CacheCommand extends SshCommand {
@Inject
protected EhcachePoolImpl cachePool;
protected DynamicMap<Cache<?, ?>> cacheMap;
protected SortedSet<String> cacheNames() {
final SortedSet<String> names = new TreeSet<String>();
for (final Ehcache c : getAllCaches()) {
names.add(c.getName());
SortedSet<String> names = Sets.newTreeSet();
for (String plugin : cacheMap.plugins()) {
for (String name : cacheMap.byPlugin(plugin).keySet()) {
names.add(cacheNameOf(plugin, name));
}
}
return names;
}
protected Ehcache[] getAllCaches() {
final CacheManager cacheMgr = cachePool.getCacheManager();
final String[] cacheNames = cacheMgr.getCacheNames();
Arrays.sort(cacheNames);
final Ehcache[] r = new Ehcache[cacheNames.length];
for (int i = 0; i < cacheNames.length; i++) {
r[i] = cacheMgr.getEhcache(cacheNames[i]);
protected String cacheNameOf(String plugin, String name) {
if ("gerrit".equals(plugin)) {
return name;
} else {
return plugin + "." + name;
}
return r;
}
}
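The naming scheme introduced in cacheNameOf() above keeps bare names for caches owned by the core server (registered under the plugin name "gerrit") and qualifies plugin caches as "plugin.name", which is what gerrit show-caches and gerrit flush-caches display. Reproduced standalone:

```java
// Cache naming convention from CacheCommand: core caches (plugin "gerrit")
// keep their bare name; plugin-defined caches are shown as "<plugin>.<name>".
class CacheNames {
  static String cacheNameOf(String plugin, String name) {
    if ("gerrit".equals(plugin)) {
      return name;
    }
    return plugin + "." + name;
  }
}
```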


@@ -14,18 +14,19 @@
package com.google.gerrit.sshd.commands;
import com.google.common.cache.Cache;
import com.google.gerrit.common.data.GlobalCapability;
import com.google.gerrit.server.IdentifiedUser;
import com.google.gerrit.sshd.BaseCommand;
import com.google.gerrit.sshd.RequiresCapability;
import com.google.inject.Inject;
import net.sf.ehcache.Ehcache;
import com.google.inject.Provider;
import org.kohsuke.args4j.Option;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.SortedSet;
/** Causes the caches to purge all entries and reload. */
@@ -95,13 +96,16 @@ final class FlushCaches extends CacheCommand {
private void doBulkFlush() {
try {
for (final Ehcache c : getAllCaches()) {
final String name = c.getName();
if (flush(name)) {
try {
c.removeAll();
} catch (Throwable e) {
stderr.println("error: cannot flush cache \"" + name + "\": " + e);
for (String plugin : cacheMap.plugins()) {
for (Map.Entry<String, Provider<Cache<?, ?>>> entry :
cacheMap.byPlugin(plugin).entrySet()) {
String n = cacheNameOf(plugin, entry.getKey());
if (flush(n)) {
try {
entry.getValue().get().invalidateAll();
} catch (Throwable err) {
stderr.println("error: cannot flush cache \"" + n + "\": " + err);
}
}
}
}


@@ -14,22 +14,24 @@
package com.google.gerrit.sshd.commands;
import com.google.common.cache.Cache;
import com.google.common.cache.CacheStats;
import com.google.common.collect.Maps;
import com.google.gerrit.common.Version;
import com.google.gerrit.common.data.GlobalCapability;
import com.google.gerrit.extensions.events.LifecycleListener;
import com.google.gerrit.server.cache.h2.H2CacheImpl;
import com.google.gerrit.server.config.SitePath;
import com.google.gerrit.server.git.WorkQueue;
import com.google.gerrit.server.git.WorkQueue.Task;
import com.google.gerrit.sshd.RequiresCapability;
import com.google.gerrit.sshd.SshDaemon;
import com.google.inject.Inject;
import net.sf.ehcache.Ehcache;
import net.sf.ehcache.Statistics;
import net.sf.ehcache.config.CacheConfiguration;
import com.google.inject.Provider;
import org.apache.mina.core.service.IoAcceptor;
import org.apache.mina.core.session.IoSession;
import org.apache.sshd.server.Environment;
import org.eclipse.jgit.storage.file.WindowCacheStatAccessor;
import org.kohsuke.args4j.Option;
@@ -43,6 +45,8 @@ import java.net.UnknownHostException;
import java.text.SimpleDateFormat;
import java.util.Collection;
import java.util.Date;
import java.util.Map;
import java.util.SortedMap;
/** Show the current cache states. */
@RequiresCapability(GlobalCapability.VIEW_CACHES)
@@ -76,8 +80,26 @@ final class ShowCaches extends CacheCommand {
@SitePath
private File sitePath;
@Option(name = "--width", aliases = {"-w"}, metaVar = "COLS", usage = "width of output table")
private int columns = 80;
private int nw;
@Override
public void start(Environment env) throws IOException {
String s = env.getEnv().get(Environment.ENV_COLUMNS);
if (s != null && !s.isEmpty()) {
try {
columns = Integer.parseInt(s);
} catch (NumberFormatException err) {
columns = 80;
}
}
super.start(env);
}
@Override
protected void run() {
nw = columns - 50;
Date now = new Date();
stdout.format(
"%-25s %-20s now %16s\n",
@@ -91,60 +113,46 @@ final class ShowCaches extends CacheCommand {
stdout.print('\n');
stdout.print(String.format(//
"%1s %-18s %-4s|%-20s| %-5s |%-14s|\n" //
"%1s %-"+nw+"s|%-21s| %-5s |%-9s|\n" //
, "" //
, "Name" //
, "Max" //
, "Object Count" //
, "Entries" //
, "AvgGet" //
, "Hit Ratio" //
));
stdout.print(String.format(//
"%1s %-18s %-4s|%6s %6s %6s| %-5s |%-4s %-4s %-4s|\n" //
"%1s %-"+nw+"s|%6s %6s %7s| %-5s |%-4s %-4s|\n" //
, "" //
, "" //
, "Age" //
, "Disk" //
, "Mem" //
, "Cnt" //
, "" //
, "Disk" //
, "Space" //
, "" //
, "Mem" //
, "Agg" //
, "Disk" //
));
stdout.print("------------------"
+ "-------+--------------------+----------+--------------+\n");
for (final Ehcache cache : getAllCaches()) {
final CacheConfiguration cfg = cache.getCacheConfiguration();
final boolean useDisk = cfg.isDiskPersistent() || cfg.isOverflowToDisk();
final Statistics stat = cache.getStatistics();
final long total = stat.getCacheHits() + stat.getCacheMisses();
stdout.print("--");
for (int i = 0; i < nw; i++) {
stdout.print('-');
}
stdout.print("+---------------------+---------+---------+\n");
if (useDisk) {
stdout.print(String.format(//
"D %-18s %-4s|%6s %6s %6s| %7s |%4s %4s %4s|\n" //
, cache.getName() //
, interval(cfg.getTimeToLiveSeconds()) //
, count(stat.getDiskStoreObjectCount()) //
, count(stat.getMemoryStoreObjectCount()) //
, count(stat.getObjectCount()) //
, duration(stat.getAverageGetTime()) //
, percent(stat.getOnDiskHits(), total) //
, percent(stat.getInMemoryHits(), total) //
, percent(stat.getCacheHits(), total) //
));
} else {
stdout.print(String.format(//
" %-18s %-4s|%6s %6s %6s| %7s |%4s %4s %4s|\n" //
, cache.getName() //
, interval(cfg.getTimeToLiveSeconds()) //
, "", "" //
, count(stat.getObjectCount()) //
, duration(stat.getAverageGetTime()) //
, "", "" //
, percent(stat.getCacheHits(), total) //
));
}
Map<String, H2CacheImpl<?, ?>> disks = Maps.newTreeMap();
printMemoryCaches(disks, sortedCoreCaches());
printMemoryCaches(disks, sortedPluginCaches());
for (Map.Entry<String, H2CacheImpl<?, ?>> entry : disks.entrySet()) {
H2CacheImpl<?, ?> cache = entry.getValue();
CacheStats stat = cache.stats();
H2CacheImpl.DiskStats disk = cache.diskStats();
stdout.print(String.format(
"D %-"+nw+"s|%6s %6s %7s| %7s |%4s %4s|\n",
entry.getKey(),
count(cache.size()),
count(disk.size()),
bytes(disk.space()),
duration(stat.averageLoadPenalty()),
percent(stat.hitCount(), stat.requestCount()),
percent(disk.hitCount(), disk.requestCount())));
}
stdout.print('\n');
@@ -165,6 +173,51 @@ final class ShowCaches extends CacheCommand {
stdout.flush();
}
private void printMemoryCaches(
Map<String, H2CacheImpl<?, ?>> disks,
Map<String, Cache<?,?>> caches) {
for (Map.Entry<String, Cache<?,?>> entry : caches.entrySet()) {
Cache<?,?> cache = entry.getValue();
if (cache instanceof H2CacheImpl) {
disks.put(entry.getKey(), (H2CacheImpl<?,?>)cache);
continue;
}
CacheStats stat = cache.stats();
stdout.print(String.format(
" %-"+nw+"s|%6s %6s %7s| %7s |%4s %4s|\n",
entry.getKey(),
count(cache.size()),
"",
"",
duration(stat.averageLoadPenalty()),
percent(stat.hitCount(), stat.requestCount()),
""));
}
}
private Map<String, Cache<?, ?>> sortedCoreCaches() {
SortedMap<String, Cache<?, ?>> m = Maps.newTreeMap();
for (Map.Entry<String, Provider<Cache<?, ?>>> entry :
cacheMap.byPlugin("gerrit").entrySet()) {
m.put(cacheNameOf("gerrit", entry.getKey()), entry.getValue().get());
}
return m;
}
private Map<String, Cache<?, ?>> sortedPluginCaches() {
SortedMap<String, Cache<?, ?>> m = Maps.newTreeMap();
for (String plugin : cacheMap.plugins()) {
if ("gerrit".equals(plugin)) {
continue;
}
for (Map.Entry<String, Provider<Cache<?, ?>>> entry :
cacheMap.byPlugin(plugin).entrySet()) {
m.put(cacheNameOf(plugin, entry.getKey()), entry.getValue().get());
}
}
return m;
}
private void memSummary() {
final Runtime r = Runtime.getRuntime();
final long mMax = r.maxMemory();
@@ -300,45 +353,24 @@ final class ShowCaches extends CacheCommand {
return String.format("%6d", cnt);
}
private String duration(double ms) {
if (Math.abs(ms) <= 0.05) {
private String duration(double ns) {
if (ns < 0.5) {
return "";
}
String suffix = "ms";
if (ms >= 1000) {
ms /= 1000;
String suffix = "ns";
if (ns >= 1000.0) {
ns /= 1000.0;
suffix = "us";
}
if (ns >= 1000.0) {
ns /= 1000.0;
suffix = "ms";
}
if (ns >= 1000.0) {
ns /= 1000.0;
suffix = "s ";
}
return String.format("%4.1f%s", ms, suffix);
}
private String interval(double ttl) {
if (ttl == 0) {
return "inf";
}
String suffix = "s";
if (ttl >= 60) {
ttl /= 60;
suffix = "m";
if (ttl >= 60) {
ttl /= 60;
suffix = "h";
}
if (ttl >= 24) {
ttl /= 24;
suffix = "d";
if (ttl >= 365) {
ttl /= 365;
suffix = "y";
}
}
}
return Integer.toString((int) ttl) + suffix;
return String.format("%4.1f%s", ns, suffix);
}
private String percent(final long value, final long total) {
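The rewritten duration() above now takes nanoseconds, since Guava's CacheStats.averageLoadPenalty() reports load time in nanos, and cascades the unit ns, us, ms, s. A standalone copy of that formatter (Locale.ROOT added here only so the output is deterministic; the server code uses the default locale):

```java
import java.util.Locale;

// The duration() formatter from ShowCaches, reproduced standalone: takes
// nanoseconds and scales the unit upward through us, ms, and s.
class Durations {
  static String duration(double ns) {
    if (ns < 0.5) {
      return "";                   // too small to be worth printing
    }
    String suffix = "ns";
    if (ns >= 1000.0) {
      ns /= 1000.0;
      suffix = "us";
    }
    if (ns >= 1000.0) {
      ns /= 1000.0;
      suffix = "ms";
    }
    if (ns >= 1000.0) {
      ns /= 1000.0;
      suffix = "s ";
    }
    return String.format(Locale.ROOT, "%4.1f%s", ns, suffix);
  }
}
```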


@@ -18,12 +18,12 @@ import static com.google.inject.Scopes.SINGLETON;
import static com.google.inject.Stage.PRODUCTION;
import com.google.gerrit.common.ChangeHookRunner;
import com.google.gerrit.ehcache.EhcachePoolImpl;
import com.google.gerrit.httpd.auth.openid.OpenIdModule;
import com.google.gerrit.httpd.plugins.HttpPluginModule;
import com.google.gerrit.lifecycle.LifecycleManager;
import com.google.gerrit.lifecycle.LifecycleModule;
import com.google.gerrit.reviewdb.client.AuthType;
import com.google.gerrit.server.cache.h2.DefaultCacheFactory;
import com.google.gerrit.server.config.AuthConfig;
import com.google.gerrit.server.config.AuthConfigModule;
import com.google.gerrit.server.config.CanonicalWebUrlModule;
@@ -200,7 +200,7 @@ public class WebAppInitializer extends GuiceServletContextListener {
modules.add(new ChangeHookRunner.Module());
modules.add(new ReceiveCommitsExecutorModule());
modules.add(cfgInjector.getInstance(GerritGlobalModule.class));
modules.add(new EhcachePoolImpl.Module());
modules.add(new DefaultCacheFactory.Module());
modules.add(new SmtpEmailSender.Module());
modules.add(new SignedTokenEmailTokenVerifier.Module());
modules.add(new PluginModule());


@@ -48,10 +48,6 @@ log4j.logger.org.openid4java.discovery.Discovery=ERROR
log4j.logger.org.openid4java.server.RealmVerifier=ERROR
log4j.logger.org.openid4java.message.AuthSuccess=ERROR
# Silence non-critical messages from ehcache
#
log4j.logger.net.sf.ehcache=WARN
# Silence non-critical messages from c3p0 (if used).
#
log4j.logger.com.mchange.v2.c3p0=WARN

pom.xml

@@ -74,7 +74,7 @@ limitations under the License.
<module>gerrit-antlr</module>
<module>gerrit-common</module>
<module>gerrit-ehcache</module>
<module>gerrit-cache-h2</module>
<module>gerrit-httpd</module>
<module>gerrit-launcher</module>
<module>gerrit-main</module>
@@ -460,6 +460,12 @@ limitations under the License.
<version>2.1</version>
</dependency>
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
<version>12.0</version>
</dependency>
<dependency>
<groupId>gwtorm</groupId>
<artifactId>gwtorm</artifactId>
@@ -552,12 +558,6 @@ limitations under the License.
<version>1.6.4</version>
</dependency>
<dependency>
<groupId>net.sf.ehcache</groupId>
<artifactId>ehcache-core</artifactId>
<version>1.7.2</version>
</dependency>
<dependency>
<groupId>args4j</groupId>
<artifactId>args4j</artifactId>